Computer-Vision Based Real Time Waypoint Generation for Autonomous Vineyard Navigation with Quadruped Robots

Cited by: 1
Authors
Milburn, Lee [1 ,2 ]
Gamba, Juan [3 ]
Fernandes, Miguel [4 ,5 ]
Semini, Claudio [3 ]
Affiliations
[1] Istituto Italiano di Tecnologia (IIT), River Lab, Genoa, Italy
[2] Northeastern University, Boston, MA 02115 USA
[3] Istituto Italiano di Tecnologia, Dynamic Legged Systems Lab, Genoa, Italy
[4] Istituto Italiano di Tecnologia, Advanced Robotics Lab, Genoa, Italy
[5] University of Genoa (UniGe), Genoa, Italy
Source
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC | 2023
Keywords
Agricultural Robotics; Computer-Vision; Autonomous Vineyard Navigation; Quadruped Control; VEHICLES;
DOI
10.1109/ICARSC58346.2023.10129563
Chinese Library Classification
TP [Automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
The VINUM project seeks to address the shortage of skilled labor in modern vineyards by introducing a cutting-edge mobile robotic solution. Leveraging the capabilities of the quadruped robot HyQReal, this system, equipped with an arm and vision sensors, offers autonomous navigation and winter pruning of grapevines, reducing the need for human intervention. At the heart of this approach lies an architecture that enables the robot to navigate vineyards, identify grapevines, and approach them for pruning with precision. A state machine drives the process, switching between stages to ensure seamless and efficient task completion. The system's performance was assessed through experimentation, focusing on waypoint precision and on optimizing the robot's workspace for single-plant operations. Results indicate that the architecture is highly reliable, with a mean error of 21.5 cm and a standard deviation of 17.6 cm for HyQReal; however, improvements in grapevine detection accuracy are necessary for optimal performance. This work presents a computer-vision-based navigation method for quadruped robots in vineyards, opening up new possibilities for selective task automation. The system's architecture performs well in ideal weather conditions, generating and reaching precise waypoints that maximize the attached robotic arm's workspace. This work is an extension of our short paper presented at the Italian Conference on Robotics and Intelligent Machines (I-RIM), 2022 [1].
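
Illustrative sketch (not from the paper): the abstract describes a state machine that switches between navigation, detection, approach, and pruning stages. The minimal Python example below shows one way such a stage-switching loop could be organized; the stage names, transition flags, and the function next_stage are illustrative assumptions, not the authors' implementation.

# Minimal state-machine sketch of a navigate -> detect -> approach -> prune
# cycle. All names and transition conditions are hypothetical.
from enum import Enum, auto


class Stage(Enum):
    NAVIGATE_ROW = auto()   # walk along the vineyard row
    DETECT_VINE = auto()    # run the vision pipeline to find the next grapevine
    APPROACH = auto()       # move to a waypoint that favors the arm's workspace
    PRUNE = auto()          # hand control to the arm for winter pruning
    DONE = auto()


def next_stage(stage: Stage, vine_found: bool, at_waypoint: bool,
               pruned: bool, row_finished: bool) -> Stage:
    """Return the next stage given simple boolean observations."""
    if stage is Stage.NAVIGATE_ROW:
        return Stage.DONE if row_finished else Stage.DETECT_VINE
    if stage is Stage.DETECT_VINE:
        return Stage.APPROACH if vine_found else Stage.NAVIGATE_ROW
    if stage is Stage.APPROACH:
        return Stage.PRUNE if at_waypoint else Stage.APPROACH
    if stage is Stage.PRUNE:
        return Stage.NAVIGATE_ROW if pruned else Stage.PRUNE
    return Stage.DONE


if __name__ == "__main__":
    # Walk one simulated plant through the cycle.
    stage = Stage.NAVIGATE_ROW
    observations = [
        dict(vine_found=False, at_waypoint=False, pruned=False, row_finished=False),
        dict(vine_found=True, at_waypoint=False, pruned=False, row_finished=False),
        dict(vine_found=True, at_waypoint=True, pruned=False, row_finished=False),
        dict(vine_found=True, at_waypoint=True, pruned=True, row_finished=False),
        dict(vine_found=False, at_waypoint=False, pruned=False, row_finished=True),
    ]
    for obs in observations:
        print(stage.name)
        stage = next_stage(stage, **obs)
    print(stage.name)

Running the sketch steps through NAVIGATE_ROW, DETECT_VINE, APPROACH, PRUNE, and back to NAVIGATE_ROW before terminating in DONE, mirroring the per-plant cycle outlined in the abstract.
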
Pages: 239-244
Page count: 6
References
22 in total
  • [1] Local Motion Planner for Autonomous Navigation in Vineyards with a RGB-D Camera-Based Algorithm and Deep Learning Synergy
    Aghi, Diego
    Mazzia, Vittorio
    Chiaberge, Marcello
    [J]. MACHINES, 2020, 8 (02)
  • [2] Vineyard Autonomous Navigation in the Echord++ GRAPE Experiment
    Astolfi, Pietro
    Gabrielli, Alessandro
    Bascetta, Luca
    Matteucci, Matteo
    [J]. IFAC PAPERSONLINE, 2018, 51 (11): 704-709
  • [3] Robot Farmers Autonomous Orchard Vehicles Help Tree Fruit Production
    Bergerman, Marcel
    Maeta, Silvio M.
    Zhang, Ji
    Freitas, Gustavo M.
    Hamner, Bradley
    Singh, Sanjiv
    Kantor, George
    [J]. IEEE ROBOTICS & AUTOMATION MAGAZINE, 2015, 22 (01): 54-63
  • [4] Development of quadruped walking robots: A review
    Biswal, Priyaranjan
    Mohanty, Prases K.
    [J]. AIN SHAMS ENGINEERING JOURNAL, 2021, 12 (02): 2017-2031
  • [5] A Robot System for Pruning Grape Vines
    Botterill, Tom
    Paulin, Scott
    Green, Richard
    Williams, Samuel
    Lin, Jessica
    Saxton, Valerie
    Mills, Steven
    Chen, XiaoQi
    Corbett-Davies, Sam
    [J]. JOURNAL OF FIELD ROBOTICS, 2017, 34 (06): 1100-1122
  • [6] A Navigation Architecture for Ackermann Vehicles in Precision Farming
    Carpio, Renzo Fabrizio
    Potena, Ciro
    Maiolini, Jacopo
    Ulivi, Giovanni
    Rossell, Nicolas Bono
    Garone, Emanuele
    Gasparri, Andrea
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (02): 1103-1110
  • [7] Fernandes Miguel, 2021, 2021 IEEE 11th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), P13, DOI 10.1109/CYBER53097.2021.9588303
  • [8] Fernandes M., 2021, arXiv
  • [9] Guadagna P., 2021, PRECISION AGR 21, DOI 10.3920/978-90-8686-916-9_16
  • [10] Hroob I, 2021, arXiv, DOI arXiv:2107.05283