Efficient tomato harvesting robot based on image processing and deep learning

Cited by: 44
Authors
Miao, Zhonghua [1 ]
Yu, Xiaoyou [1 ]
Li, Nan [1 ]
Zhang, Zhe [1 ]
He, Chuangxin [1 ]
Li, Zhao [1 ]
Deng, Chunyu [1 ]
Sun, Teng [1 ]
Affiliations
[1] Shanghai Univ, Sch Mechatron Engn & Automat, Dept Automat, Intelligent Equipment & Robot Lab, Shangda St 99, Shanghai, Peoples R China
Keywords
Image processing; YOLOv5 network; Agriculture robot; Tomato harvesting; Machine vision; Localization
DOI
10.1007/s11119-022-09944-w
CLC classification
S [Agricultural Sciences];
Subject classification code
09;
Abstract
Agricultural robots are rapidly becoming more advanced with the development of relevant technologies and are in great demand to guarantee the food supply. As such, they are slated to play an important role in precision agriculture. In tomato production, harvesting employs over 40% of the total workforce, so it is meaningful to develop a robot harvester to assist workers. The objective of this work is to understand the factors restricting recognition accuracy when using image processing and deep learning methods, and to improve the performance of crop detection in complex agricultural environments. With accurate recognition of the growing status and location of crops, temporal crop management and selective harvesting become feasible, and issues caused by the growing shortage of agricultural labour can be alleviated. To this end, this work integrates classic image processing methods with the YOLOv5 (You Only Look Once version 5) network to increase the accuracy and robustness of tomato and stem perception. As a result, an algorithm to estimate the degree of maturity of truss tomatoes (clusters of individual tomatoes) and an integrated method to locate stems, weighted by the experimentally measured error of each individual method, were proposed. Both indoor and real-field tests were carried out using a robot harvester. The results demonstrated the high accuracy of the proposed algorithms under varied illumination conditions, with an average deviation of 2 mm from the ground truth. The robot can be guided to harvest truss tomatoes efficiently, with an average operating time of 9 s per cluster.
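The maturity-estimation idea described in the abstract — classifying a detected truss as ripe or unripe from the colour content of its bounding-box region — can be sketched with a simple colour-dominance rule. This is an illustrative approximation only: the threshold, the red-dominance criterion, and the function name are assumptions for the sketch, not the paper's actual algorithm.

```python
import numpy as np

def truss_maturity(rgb_patch: np.ndarray, red_thresh: float = 1.3) -> float:
    """Estimate the maturity of a detected truss-tomato patch.

    rgb_patch: H x W x 3 uint8 RGB crop taken from a detector
    bounding box. Returns the fraction of "ripe" pixels, where a
    pixel counts as ripe when its red channel exceeds the mean of
    its green and blue channels by the factor `red_thresh`
    (an illustrative rule, not the paper's exact criterion).
    """
    patch = rgb_patch.astype(np.float32)
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    # Red-dominant pixels are treated as ripe fruit surface.
    ripe = r > red_thresh * (g + b) / 2.0
    return float(ripe.mean())
```

In practice the ratio would be compared against a calibrated cutoff (e.g. harvest when most of the cluster area is red-dominant), and a robust pipeline would mask out background pixels inside the box before counting.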
Pages: 254-287 (34 pages)
References
31 in total
  • [1] Human-robot collaborative site-specific sprayer
    Berenstein, Ron
    Edan, Yael
    [J]. JOURNAL OF FIELD ROBOTICS, 2017, 34 (08) : 1519 - 1530
  • [2] Chen XY, 2015, IEEE INT C INT ROBOT, P6487, DOI 10.1109/IROS.2015.7354304
  • [3] Transfer Path Analysis and Contribution Evaluation Using SVD- and PCA-Based Operational Transfer Path Analysis
    Cheng, Wei
    Blamaud, Diane
    Chu, Yapeng
    Meng, Lei
    Lu, Jingbai
    Basit, Wajid Ali
    [J]. SHOCK AND VIBRATION, 2020, 2020
  • [4] Viewpoint: The future of work in agri-food
    Christiaensen, Luc
    Rutledge, Zachariah
    Taylor, J. Edward
    [J]. FOOD POLICY, 2021, 99
  • [5] Feng QC, 2015, 2015 IEEE INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION, P949, DOI 10.1109/ICInfA.2015.7279423
  • [6] A robotic irrigation system for urban gardening and agriculture
    Gravalos, Ioannis
    Avgousti, Avgoustinos
    Gialamas, Theodoros
    Alfieris, Nikolaos
    Paschalidis, Georgios
    [J]. JOURNAL OF AGRICULTURAL ENGINEERING, 2019, 50 (04) : 198 - 207
  • [7] Hess W, 2016, IEEE INT CONF ROBOT, P1271, DOI 10.1109/ICRA.2016.7487258
  • [8] Fruit recognition based on pulse coupled neural network and genetic Elman algorithm application in apple harvesting robot
    Jia, Weikuan
    Mou, Shanhao
    Wang, Jing
    Liu, Xiaoyang
    Zheng, Yuanjie
    Lian, Jian
    Zhao, Dean
    [J]. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2020, 17 (01):
  • [9] Integrating machine vision-based row guidance with GPS and compass-based routing to achieve autonomous navigation for a rice field weeding robot
    Kanagasingham, Sabeethan
    Ekpanyapong, Mongkol
    Chaihan, Rachan
    [J]. PRECISION AGRICULTURE, 2020, 21 (04) : 831 - 855
  • [10] Autonomous Sweet Pepper Harvesting for Protected Cropping Systems
    Lehnert, Christopher
    English, Andrew
    McCool, Christopher
    Tow, Adam W.
    Perez, Tristan
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2017, 2 (02): : 872 - 879