Multi-Modal Deep Learning for Weeds Detection in Wheat Field Based on RGB-D Images

Times Cited: 16
Authors
Xu, Ke [1 ,2 ,3 ,4 ,5 ]
Zhu, Yan [1 ,2 ,3 ,4 ,5 ]
Cao, Weixing [1 ,2 ,3 ,4 ,5 ]
Jiang, Xiaoping [1 ,2 ,3 ,4 ,5 ]
Jiang, Zhijian [6 ]
Li, Shuailong [6 ]
Ni, Jun [1 ,2 ,3 ,4 ,5 ]
Affiliations
[1] Nanjing Agr Univ, Coll Agr, Nanjing, Peoples R China
[2] Natl Engn & Technol Ctr Informat Agr, Nanjing, Peoples R China
[3] Minist Educ, Engn Res Ctr Smart Agr, Nanjing, Peoples R China
[4] Jiangsu Key Lab Informat Agr, Nanjing, Peoples R China
[5] Jiangsu Collaborat Innovat Ctr Technol & Applicat, Nanjing, Peoples R China
[6] Nanjing Agr Univ, Coll Artificial Intelligence, Nanjing, Peoples R China
Source
FRONTIERS IN PLANT SCIENCE | 2021, Vol. 12
Funding
National Natural Science Foundation of China;
Keywords
weeds detection; RGB-D image; multi-modal deep learning; machine learning; three-channel network; CROP; VISION; GROWTH; IMPACT; YIELD;
DOI
10.3389/fpls.2021.732968
Chinese Library Classification
Q94 [Botany];
Discipline Code
071001;
Abstract
Single-modal images carry limited information for feature representation, and RGB images fail to detect grass weeds in wheat fields because of their similarity to wheat in shape. We propose a framework based on multi-modal information fusion for the accurate detection of weeds in wheat fields in a natural environment, overcoming the limitations of a single modality in weed detection. First, we recode the single-channel depth image into a new three-channel image with a structure like that of an RGB image, which is suitable for feature extraction by a convolutional neural network (CNN). Second, multi-scale object detection is realized by fusing the feature maps output by different convolutional layers. The three-channel network structure is designed to account for both the independence of the RGB and depth information and the complementarity of the multi-modal information, and integrated learning is carried out by weight allocation at the decision level to realize effective fusion of the multi-modal information. The experimental results show that, compared with weed detection based on RGB images alone, the accuracy of our method is significantly improved. Experiments with integrated learning achieve a mean average precision (mAP) of 36.1% for grass weeds and 42.9% for broad-leaf weeds, and the overall detection precision, as indicated by intersection over ground truth (IoG), is 89.3%, with the weights of the RGB and depth images at alpha = 0.4 and beta = 0.3. The results suggest that our method can accurately detect the dominant species of weeds in wheat fields and that multi-modal fusion can effectively improve object detection performance.
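The abstract describes two key steps: recoding a single-channel depth map into a three-channel image for CNN feature extraction, and weighting the branch outputs at the decision level (alpha = 0.4 for RGB, beta = 0.3 for depth). The exact depth encoding is not specified in the abstract; the sketch below is a minimal illustration assuming a normalized-depth / gradient-magnitude / mean-offset channel stack, and assuming the remaining weight (1 - alpha - beta) goes to the joint RGB-D branch. The function names are hypothetical.

```python
import numpy as np

def depth_to_three_channel(depth: np.ndarray) -> np.ndarray:
    """Recode a single-channel depth map into a three-channel image.

    Assumed encoding (not specified in the abstract): normalized depth,
    normalized gradient magnitude, and normalized offset from mean depth.
    """
    d = depth.astype(np.float32)
    d = (d - d.min()) / (d.max() - d.min() + 1e-8)   # normalize depth to [0, 1]
    gy, gx = np.gradient(d)                           # per-axis depth gradients
    grad = np.sqrt(gx ** 2 + gy ** 2)
    grad = grad / (grad.max() + 1e-8)                 # normalize gradient magnitude
    height = d - d.mean()                             # offset from mean depth
    height = (height - height.min()) / (height.max() - height.min() + 1e-8)
    return np.stack([d, grad, height], axis=-1)       # H x W x 3, RGB-like layout

def fuse_scores(rgb_score: float, depth_score: float, fused_score: float,
                alpha: float = 0.4, beta: float = 0.3) -> float:
    """Decision-level weighted fusion of the three branch scores.

    alpha and beta follow the abstract; gamma = 1 - alpha - beta is an
    assumed weight for the joint RGB-D branch.
    """
    gamma = 1.0 - alpha - beta
    return alpha * rgb_score + beta * depth_score + gamma * fused_score
```

With these weights, three agreeing branch scores are preserved (e.g. `fuse_scores(1.0, 1.0, 1.0)` returns 1.0), while disagreement is down-weighted toward the RGB branch.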
Pages: 10