Efficient tomato harvesting robot based on image processing and deep learning

Cited by: 44
Authors
Miao, Zhonghua [1 ]
Yu, Xiaoyou [1 ]
Li, Nan [1 ]
Zhang, Zhe [1 ]
He, Chuangxin [1 ]
Li, Zhao [1 ]
Deng, Chunyu [1 ]
Sun, Teng [1 ]
Affiliations
[1] Shanghai Univ, Sch Mechatron Engn & Automat, Dept Automat, Intelligent Equipment & Robot Lab, Shangda St 99, Shanghai, Peoples R China
Keywords
Image processing; YOLOv5 network; Agricultural robot; Tomato harvesting; Machine vision; Localization
DOI
10.1007/s11119-022-09944-w
CLC Classification Code
S [Agricultural Sciences]
Discipline Classification Code
09
Abstract
Agricultural robots are rapidly becoming more advanced with the development of relevant technologies and are in great demand to guarantee the food supply. As such, they are slated to play an important role in precision agriculture. In tomato production, harvesting employs over 40% of the total workforce, so it is worthwhile to develop a robot harvester to assist workers. The objective of this work is to understand the factors restricting recognition accuracy when using image processing and deep learning methods, and to improve the performance of crop detection in complex agricultural environments. With accurate recognition of the growing status and location of crops, timely crop management and selective harvesting become feasible, and issues caused by the growing shortage of agricultural labour can be alleviated. To this end, this work integrates classic image processing methods with the YOLOv5 (You Only Look Once version 5) network to increase the accuracy and robustness of tomato and stem perception. On this basis, an algorithm to estimate the maturity of truss tomatoes (clusters of individual tomatoes) and an integrated method that locates stems by combining individual methods according to their experimentally measured errors were proposed. Both indoor and real-field tests were carried out using a robot harvester. The results demonstrated the high accuracy of the proposed algorithms under varied illumination conditions, with an average deviation of 2 mm from the ground truth. The robot can be guided to harvest truss tomatoes efficiently, with an average operating time of 9 s per cluster.
Pages: 254-287
Number of pages: 34
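
The abstract above outlines two technical components: estimating the maturity of detected truss tomatoes, and fusing several stem-localization methods according to their measured errors. The sketch below is a minimal, hypothetical Python illustration of both ideas, not the paper's implementation: it assumes a YOLOv5 detector has already returned bounding boxes, uses a simple HSV red-pixel ratio as a stand-in for the maturity estimate, and combines stem position estimates by inverse-error weighting as one possible reading of the "integrated method". Function names, thresholds and weights are illustrative.

# Hypothetical sketch (not the paper's code): maturity estimation and
# error-weighted stem-position fusion for a tomato harvesting pipeline.
# Bounding boxes are assumed to come from a YOLOv5 detector; the HSV
# thresholds and inverse-error weighting are illustrative assumptions.
import cv2
import numpy as np

def ripeness_ratio(image_bgr, box):
    """Fraction of red (ripe) pixels inside a detector bounding box (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    roi = image_bgr[y1:y2, x1:x2]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    # Red wraps around OpenCV's 0-179 hue axis, so combine two hue bands.
    low_band = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255))
    high_band = cv2.inRange(hsv, (170, 80, 60), (179, 255, 255))
    red_mask = cv2.bitwise_or(low_band, high_band)
    return float(np.count_nonzero(red_mask)) / red_mask.size

def fuse_stem_estimates(positions, errors):
    """Combine per-method 3-D stem position estimates, weighting each by its inverse error."""
    weights = 1.0 / np.asarray(errors, dtype=float)
    return np.average(np.asarray(positions, dtype=float), axis=0, weights=weights)

if __name__ == "__main__":
    # Synthetic patch standing in for a detected truss: half red, half green (BGR).
    img = np.zeros((100, 100, 3), dtype=np.uint8)
    img[:, :50] = (0, 0, 200)
    img[:, 50:] = (0, 200, 0)
    print(f"ripe-pixel ratio: {ripeness_ratio(img, (0, 0, 100, 100)):.2f}")

    # Two hypothetical stem estimates (mm) with measured errors of 2 mm and 5 mm.
    fused = fuse_stem_estimates([[120.0, 40.0, 310.0], [126.0, 44.0, 316.0]], [2.0, 5.0])
    print("fused stem position (mm):", np.round(fused, 1))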