Maize crop row recognition algorithm based on improved UNet network

Cited by: 40
Authors
Diao, Zhihua [1]
Guo, Peiliang [1]
Zhang, Baohua [2]
Zhang, Dongyan [3]
Yan, Jiaonan [1]
He, Zhendong [1]
Zhao, Suna [1]
Zhao, Chunjiang [4]
Affiliations
[1] Zhengzhou Univ Light Ind, Sch Elect Informat Engn, Zhengzhou 450002, Peoples R China
[2] Nanjing Agr Univ, Coll Artificial Intelligence, Nanjing 211800, Peoples R China
[3] Anhui Univ, Natl Engn Res Ctr Agroecol Big Data Anal & Applica, Hefei 230601, Peoples R China
[4] Beijing Acad Agr & Forestry Sci, Informat Technol Res Ctr, Beijing 100097, Peoples R China
Keywords
Maize crop row detection; Improved UNet network; Improved vertical projection method; Least squares method; Robot
DOI
10.1016/j.compag.2023.107940
Chinese Library Classification
S [Agricultural Sciences]
Discipline Classification Code
09
Abstract
Aiming at the difficulty of identifying maize crop row centerlines in complex farmland environments, such as heavy weeds, broken rows, and leaf adhesion across different growth periods, this study proposes a centerline detection algorithm based on an improved UNet network. The traditional UNet semantic segmentation network was enhanced into the Atrous Spatial Pyramid Pooling UNet (ASPP-UNet) network to segment maize crop rows from the background, an improved vertical projection method was then employed to extract crop-row feature points, and finally the least squares method was used to fit the centerlines. Experimental results show that the ASPP-UNet network achieved Mean Intersection over Union, Mean Pixel Accuracy, Mean Precision, and Mean Recall of 83.23%, 90.18%, 91.79%, and 90.18%, respectively: increases of 10.03%, 11.86%, 9.43%, and 11.24% over the Fully Convolutional Network (FCN), and of 7.80%, 5.52%, 2.71%, and 5.52% over the UNet. Furthermore, the improved vertical projection method combined with the least squares method reduced the average fitting time and angle error to 66 ms and 4.37 degrees, compared with 80 ms and 6.12 degrees for the traditional vertical projection method and 86 ms and 5.67 degrees for the left-and-right-edge centerline method. Likewise, the accuracy of the proposed method rose to 92.59%, versus 87.21% and 90.16% for those two comparison methods, respectively. The proposed method therefore meets the accuracy and real-time requirements of agricultural robot vision navigation and functions effectively under varying environmental conditions.
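The post-segmentation stages of the pipeline (per-strip vertical projection to extract feature points, then a least squares line fit) can be illustrated with a short sketch. The following is a minimal numpy illustration, assuming a binary mask that contains a single, roughly vertical crop row; the strip count, the projection-centroid feature point, and the x = a·y + b line parameterisation are simplifying assumptions for illustration, not the paper's exact improved method.

```python
import numpy as np

def crop_row_centerline(mask, n_strips=10):
    """Fit one crop-row centerline from a binary segmentation mask.

    mask: 2-D array, 1 = crop-row pixels (assumed to contain a single,
    roughly vertical row). Returns (a, b) of the line x = a * y + b.
    """
    h, w = mask.shape
    ys, xs = [], []
    # Split the image into horizontal strips; inside each strip, a
    # vertical projection (column-wise pixel sum) locates the row centre.
    bounds = np.linspace(0, h, n_strips + 1, dtype=int)
    for top, bottom in zip(bounds[:-1], bounds[1:]):
        proj = mask[top:bottom].sum(axis=0)  # vertical projection profile
        if proj.sum() == 0:
            continue  # strip contains no crop pixels; skip it
        centre_x = np.average(np.arange(w), weights=proj)  # projection centroid
        ys.append((top + bottom) / 2.0)
        xs.append(centre_x)
    # Least squares line through the per-strip feature points.
    a, b = np.polyfit(ys, xs, deg=1)
    return a, b

# Usage on a synthetic near-vertical row: recovers slope ~0.2, intercept ~30.
mask = np.zeros((120, 80), dtype=np.uint8)
for y in range(120):
    x = int(30 + 0.2 * y)
    mask[y, x - 2:x + 3] = 1
print(crop_row_centerline(mask))
```

The line is parameterised as x in terms of y because crop rows are near-vertical in field images, so fitting y = f(x) would produce near-infinite slopes and an ill-conditioned fit.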
Pages: 11