Vision-Based Localization Method for Picking Points in Tea-Harvesting Robots

Cited: 3
Authors
Yang, Jingwen [1 ]
Li, Xin [1 ]
Wang, Xin [1 ]
Fu, Leiyang [1 ]
Li, Shaowen [1 ]
Affiliations
[1] Anhui Agr Univ, Sch Informat & Artificial Intelligence, Key Lab Agr Sensors, Minist Agr & Rural Affairs, Hefei 230036, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
deep learning; RGB-D; tea; picking point localization;
DOI
10.3390/s24216777
CLC Classification
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
To address the challenge of accurately recognizing and locating picking points for tea-picking robots in unstructured environments, a visual positioning method based on RGB-D information fusion is proposed. First, an improved T-YOLOv8n model is introduced, which improves detection and segmentation performance across multi-scale scenes through optimizations to the network architecture and loss function. On the far-view test set, the detection accuracy for tea buds reached 80.8%; on the near-view test set, the mAP@0.5 values for tea-stem detection in bounding boxes and masks reached 93.6% and 93.7%, respectively, improvements of 9.1% and 14.1% over the baseline model. Second, a layered far-view/near-view visual servoing strategy was designed, integrating a RealSense depth sensor with the robotic arm. This strategy identifies the region of interest (ROI) of the tea bud in the far view, then fuses the stem mask with depth data to compute the three-dimensional coordinates of the picking point. Experiments show that the method achieved a picking-point localization success rate of 86.4%, with a mean depth measurement error of 1.43 mm. The proposed method improves the accuracy of picking-point recognition and reduces depth-information fluctuations, providing technical support for the intelligent, rapid picking of premium tea.
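The fusion step the abstract describes — combining the stem segmentation mask with aligned depth data to obtain a 3D picking point — can be sketched with the standard pinhole back-projection model. This is an illustrative sketch only, not the paper's exact method: the function name, the choice of the mask centroid as the picking pixel, and the use of a median over mask depths (to suppress the depth fluctuations the abstract mentions) are assumptions; `fx, fy, cx, cy` are the camera intrinsics reported by the depth sensor.

```python
import numpy as np

def picking_point_3d(depth_mm, stem_mask, fx, fy, cx, cy):
    """Estimate the 3D picking point (camera frame, millimetres) by fusing
    a stem segmentation mask with an aligned depth map.

    depth_mm:  (H, W) depth image in millimetres, 0 = invalid reading
    stem_mask: (H, W) boolean mask of the detected tea stem
    fx, fy, cx, cy: pinhole intrinsics of the (aligned) depth camera
    Returns np.array([X, Y, Z]) or None if no usable depth is found.
    """
    ys, xs = np.nonzero(stem_mask)
    if xs.size == 0:
        return None                       # no stem detected
    z_vals = depth_mm[ys, xs].astype(float)
    z_vals = z_vals[z_vals > 0]           # drop invalid (zero) depth pixels
    if z_vals.size == 0:
        return None
    z = np.median(z_vals)                 # median damps depth outliers
    u, v = xs.mean(), ys.mean()           # mask centroid as the picking pixel
    x = (u - cx) * z / fx                 # pinhole back-projection
    y = (v - cy) * z / fy
    return np.array([x, y, z])
```

In practice the intrinsics would come from the RealSense SDK (e.g. the depth stream's intrinsics after aligning depth to color), and the resulting camera-frame point would be transformed into the robot-arm base frame before motion planning.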
Pages: 18
Cited References (48 total)
[1]   3D Sensors for Sewer Inspection: A Quantitative Review and Analysis [J].
Bahnsen, Chris H. ;
Johansen, Anders S. ;
Philipsen, Mark P. ;
Henriksen, Jesper W. ;
Nasrollahi, Kamal ;
Moeslund, Thomas B. .
SENSORS, 2021, 21 (07)
[2]  
Bello RW, 2024, Artificial Intelligence and Applications, V2, P115, DOI 10.47852/bonviewAIA42021603
[3]  
Bolya D., Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), DOI 10.1109/ICCV.2019.00925
[4]   A YOLOv3-based computer vision system for identification of tea buds and the picking point [J].
Chen, Chunlin ;
Lu, Jinzhu ;
Zhou, Mingchuan ;
Yi, Jiao ;
Liao, Min ;
Gao, Zongmei .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2022, 198
[5]   Dynamic visual servo control methods for continuous operation of a fruit harvesting robot working throughout an orchard [J].
Chen, Mingyou ;
Chen, Zengxing ;
Luo, Lufeng ;
Tang, Yunchao ;
Cheng, Jiabing ;
Wei, Huiling ;
Wang, Jinhai .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2024, 219
[6]   Localizing plucking points of tea leaves using deep convolutional neural networks [J].
Chen, Yu-Ting ;
Chen, Shih-Fang .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2020, 171
[7]   Application of consumer RGB-D cameras for fruit detection and localization in field: A critical review [J].
Fu, Longsheng ;
Gao, Fangfang ;
Wu, Jingzhu ;
Li, Rui ;
Karkee, Manoj ;
Zhang, Qin .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2020, 177
[8]  
Howard AG, 2017, arXiv, DOI 10.48550/arXiv.1704.04861
[9]  
Gonzalez R.C., 2009, Digital Image Processing, DOI 10.1117/1.3115362
[10]   Simultaneous detection of fruits and fruiting stems in mango using improved YOLOv8 model deployed by edge device [J].
Gu, Zenan ;
He, Deqiang ;
Huang, Junduan ;
Chen, Jiqing ;
Wu, Xiuhong ;
Huang, Bincheng ;
Dong, Tianyun ;
Yang, Qiumei ;
Li, Hongwei .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2024, 227