Lightweight detection networks for tea bud on complex agricultural environment via improved YOLO v4

Cited: 76
Authors
Li, Jie [1 ]
Li, Jiehao [1 ,3 ]
Zhao, Xin [1 ,2 ]
Su, Xiaohang [3 ]
Wu, Weibin [1 ,2 ]
Affiliations
[1] South China Agr Univ, Coll Engn, Key Lab Key Technol Agr Machine & Equipment, Minist Educ, Guangzhou 510642, Peoples R China
[2] Guangdong Engn Technol Res Ctr Creat Hilly Orchard, Guangzhou 510642, Peoples R China
[3] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Target detection; Neural networks; Lightweight network; Tea bud; Loss function; RECOGNITION; ROBOT;
DOI
10.1016/j.compag.2023.107955
CLC number
S [Agricultural Sciences];
Discipline classification code
09 ;
Abstract
Rapid and accurate detection of tender tea buds in the natural tea garden environment is the basis for intelligent tea picking. However, complex models impose high hardware computing-power requirements and limit the deployment of tea bud recognition models on tea-picking robots. Therefore, this paper investigates a high-precision, lightweight target detection model based on improved you-only-look-once version 4 (YOLOv4). The lightweight network GhostNet replaces the backbone of YOLOv4, and depthwise separable convolution is substituted for standard convolution, significantly reducing the computational load and complexity of the model. Additionally, the convolutional block attention module (CBAM) is embedded into the path aggregation network (PANet), enhancing the model's feature extraction capability. To address the detection and discrimination difficulties caused by overlapping one-bud-one-leaf and one-bud-two-leaf targets, this paper replaces the CIoU loss function in YOLOv4 with the SIoU loss function. The SIoU loss function considers the vector angle between the ground-truth box and the prediction box and redefines the penalty terms, improving the model's training speed and detection accuracy. The experimental results show that the detection accuracy of the proposed approach is 85.15% for one-bud-one-leaf and one-bud-two-leaf targets, with a computational cost of 6.594 G floating point operations (GFLOPs) and 11.353 M parameters. Relative to the original YOLOv4, the proposed algorithm improves mean accuracy by 1.08%, reduces computational complexity by 89.11%, and reduces the number of parameters by 82.36%. The Tea-YOLO algorithm also demonstrates significantly better detection performance than YOLOv4 across different viewing angles and natural environments.
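To see why substituting depthwise separable convolution for standard convolution shrinks the model so much, the parameter counts can be compared directly. This is a generic sketch of the well-known arithmetic, not the paper's code; the 3x3 kernel and 256-channel sizes below are illustrative assumptions, not values taken from the paper.

```python
def standard_conv_params(k, c_in, c_out):
    # a k x k filter over all c_in channels, for each of c_out output channels
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    # depthwise step: one k x k filter per input channel,
    # pointwise step: a 1x1 conv mixing c_in channels into c_out
    return k * k * c_in + c_in * c_out

std = standard_conv_params(3, 256, 256)            # 589,824 parameters
dsc = depthwise_separable_params(3, 256, 256)      # 67,840 parameters
print(std, dsc, dsc / std)                         # ratio ~ 1/k^2 + 1/c_out
```

For a 3x3 kernel with 256 channels in and out, the separable version needs roughly 11.5% of the parameters, which is consistent in spirit with the large parameter reduction the paper reports.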
The algorithm proposed in this paper detects one-bud-one-leaf and one-bud-two-leaf targets quickly and accurately, reducing the cost and difficulty of deploying the vision module of a tea-picking robot.
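The SIoU loss mentioned above augments IoU with angle-, distance-, and shape-aware penalties on the box centers and sizes. The following is a minimal single-box sketch following the general formulation in Gevorgyan (2022), not the authors' implementation; the shape exponent `theta=4` and the `eps` guard are assumptions made here for numerical stability.

```python
import math

def iou(a, b):
    # boxes as (x1, y1, x2, y2)
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def siou_loss(pred, gt, theta=4.0, eps=1e-9):
    pcx, pcy = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    gcx, gcy = (gt[0] + gt[2]) / 2, (gt[1] + gt[3]) / 2
    pw, ph = pred[2] - pred[0], pred[3] - pred[1]
    gw, gh = gt[2] - gt[0], gt[3] - gt[1]
    # smallest box enclosing both pred and gt
    cw = max(pred[2], gt[2]) - min(pred[0], gt[0])
    ch = max(pred[3], gt[3]) - min(pred[1], gt[1])
    # angle cost: sin(2*alpha), where alpha is the angle of the center offset
    sigma = math.hypot(gcx - pcx, gcy - pcy)
    sin_a = abs(gcy - pcy) / (sigma + eps)
    angle = 2 * sin_a * math.sqrt(max(0.0, 1 - sin_a ** 2))
    # distance cost, modulated by the angle cost
    gamma = 2 - angle
    rho_x = ((gcx - pcx) / (cw + eps)) ** 2
    rho_y = ((gcy - pcy) / (ch + eps)) ** 2
    dist = (1 - math.exp(-gamma * rho_x)) + (1 - math.exp(-gamma * rho_y))
    # shape cost: penalizes width/height mismatch
    ow = abs(pw - gw) / (max(pw, gw) + eps)
    oh = abs(ph - gh) / (max(ph, gh) + eps)
    shape = (1 - math.exp(-ow)) ** theta + (1 - math.exp(-oh)) ** theta
    return 1 - iou(pred, gt) + (dist + shape) / 2

box = (0.0, 0.0, 10.0, 10.0)
print(siou_loss(box, box))                      # perfectly matched boxes: loss ~ 0
print(siou_loss(box, (2.0, 0.0, 12.0, 10.0)))   # shifted box: positive loss
```

Because the angle term feeds into the distance penalty via `gamma`, a prediction whose center is offset diagonally is penalized differently from one offset along an axis, which is the property the abstract credits for faster convergence.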
Pages: 13