Multi-level feature re-weighted fusion for the semantic segmentation of crops and weeds

Cited by: 17
Authors
Janneh, Lamin L. [1 ,2 ]
Zhang, Yongjun [1 ]
Cui, Zhongwei [3 ]
Yang, Yitong [1 ]
Affiliations
[1] Guizhou Univ, Sch Comp Sci & Technol, State Key Lab Publ Big Data, Guiyang 550025, Peoples R China
[2] Univ Gambia UTG, Sch Informat Commun & Technol, Peace Bldg,POB 3530, Banjul, Kanifing, Gambia
[3] Guizhou Educ Univ, Sch Math & Big Data, Guiyang 550018, Peoples R China
Keywords
Deep learning; Pixel-wise classification; Semantic segmentation; Weed detection; Precision farming; Robotic vision
DOI
10.1016/j.jksuci.2023.03.023
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Intelligent farm robots empowered by suitable vision algorithms are a new class of agricultural machinery that eases weed control with speed and accuracy. Because crops, weeds, and other background objects in farmland images are highly similar, an improved deep convolutional neural network (DCNN) is proposed for pixel-wise semantic segmentation of crops and weeds. First, a lightweight backbone is proposed to balance the textural and shape signals of the feature maps, which are essential cues for better crop and weed prediction. Second, a multi-level feature re-weighted fusion (MFRWF) module is introduced to combine only the relevant information from each backbone layer's output, improving the contextual maps of crops and weeds. Finally, a decoder based on convolutional weighted fusion (CWF) is designed to preserve relevant crop and weed context information by reducing possible feature-context distortion. Experimental results show that the improved network obtains mean intersection over union (MIoU) scores of 0.8646, 0.9164, and 0.8459 on the Crop/Weed Field Image Dataset (CWFID), the sugar beet (BoniRob) dataset, and the rice seedling dataset, respectively. The results not only outperform commonly used architectures but also show that the network can precisely identify crops and weeds and substantially improve the robot's inference speed with minimal memory overhead. The code is available at: https://github.com/jannehlamin/MFRWF. (c) 2023 The Authors. Published by Elsevier B.V. on behalf of King Saud University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
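The abstract describes fusing features from several backbone stages by weighting each level before combining them. Below is a minimal PyTorch sketch of such multi-level re-weighted fusion, assuming learned per-level scalar weights normalized with a softmax; the class name ReWeightedFusion, the channel counts, and the 1x1 projection layers are illustrative assumptions and not the authors' released MFRWF implementation (see the GitHub link above for that).

# Minimal sketch of multi-level re-weighted feature fusion (illustrative only;
# the authors' actual MFRWF code is in the linked repository).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReWeightedFusion(nn.Module):
    """Fuse feature maps from several backbone stages with learned per-level weights."""

    def __init__(self, in_channels, out_channels=128):
        super().__init__()
        # 1x1 projections bring every level to a common channel count.
        self.projs = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels
        )
        # One learnable scalar per level, normalized with softmax at fusion time.
        self.level_weights = nn.Parameter(torch.zeros(len(in_channels)))

    def forward(self, feats):
        # Resize every projected level to the resolution of the first (finest) level.
        target_size = feats[0].shape[-2:]
        projected = [
            F.interpolate(p(f), size=target_size, mode="bilinear", align_corners=False)
            for p, f in zip(self.projs, feats)
        ]
        w = torch.softmax(self.level_weights, dim=0)
        # Weighted sum keeps only as much of each level as its learned weight allows.
        return sum(wi * fi for wi, fi in zip(w, projected))

# Usage example: fuse three backbone stages with 64, 128, and 256 channels.
fusion = ReWeightedFusion([64, 128, 256])
feats = [torch.randn(1, 64, 64, 64), torch.randn(1, 128, 32, 32), torch.randn(1, 256, 16, 16)]
out = fusion(feats)  # -> torch.Size([1, 128, 64, 64])

The softmax over the per-level weights is one simple way to let the network suppress levels that contribute little to the crop/weed context; channel- or spatial-attention weights would be a natural alternative under the same fusion scheme.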
Pages: 13