Augmented Perception for Agricultural Robots Navigation

Cited by: 46
Authors
Rovira-Mas, Francisco [1 ]
Saiz-Rubio, Veronica [1 ]
Cuenca-Cuenca, Andres [1 ]
Affiliation
[1] Univ Politecn Valencia, Agr Robot Lab, Valencia 46022, Spain
Keywords
Robot sensing systems; Three-dimensional displays; Agriculture; Navigation; Cameras; 3D Vision; field robotics; autonomous navigation; digital farming; local perception; sensor fusion; STEREO VISION; MAPS;
DOI
10.1109/JSEN.2020.3016081
Chinese Library Classification
TM [Electrical technology]; TN [Electronic technology, communication technology];
Discipline classification codes
0808 ; 0809 ;
Abstract
Producing food sustainably is increasingly challenging today due to the scarcity of skilled labor, the high cost of labor where it is available, and the thin margins growers face as large supermarket chains press produce prices down while the costs of inputs such as fuel, chemicals, seeds, and water keep rising. Robotics emerges as a technological advance that can counterbalance some of these challenges, mainly in industrialized countries. However, deploying autonomous machines in open environments exposed to uncertainty and harsh ambient conditions poses a significant challenge to reliability and safety. Consequently, a deep, real-time parametrization of the working environment is necessary to achieve autonomous navigation. This article proposes a navigation strategy for guiding a robot along vineyard rows for field monitoring. Given that global positioning cannot be guaranteed at all times in every vineyard, the strategy is based on local perception and results from fusing three complementary technologies: 3D vision, lidar, and ultrasonics. Several perception-based navigation algorithms were developed between 2015 and 2019. Their comparison under real environments and conditions showed that the augmented perception derived from combining these three technologies provides a consistent basis for outlining the intelligent behavior of agricultural robots operating within orchards.
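The idea of fusing complementary local-perception sensors for row following can be illustrated with a minimal sketch. The sensor set (3D vision, lidar, ultrasonics) comes from the abstract; everything else below — the function names, the inverse-variance fusion rule, the noise variances, and the proportional steering gain — is an illustrative assumption, not the algorithm published in the article. Each sensor is assumed to yield a lateral-offset estimate relative to the row centerline; the estimates are fused, and the fused offset drives a steering correction.

```python
# Hypothetical sketch of local-perception sensor fusion for vineyard rows.
# Fusion rule, variances, and gains are assumptions for illustration only.

def fuse_offsets(measurements):
    """Inverse-variance weighted fusion of lateral-offset estimates.

    measurements: list of (offset_m, variance_m2) pairs, one per sensor.
    Returns (fused_offset_m, fused_variance_m2).
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * off for (off, _), w in zip(measurements, weights)) / total
    return fused, 1.0 / total


def steering_command(offset_m, gain_deg_per_m=50.0, max_deg=20.0):
    """Proportional steering angle (degrees) toward the row centerline."""
    angle = -gain_deg_per_m * offset_m
    return max(-max_deg, min(max_deg, angle))


if __name__ == "__main__":
    # Assumed readings: stereo camera, lidar, ultrasonic offsets (m) with
    # per-sensor variances reflecting their relative reliability.
    readings = [(0.10, 0.02), (0.08, 0.005), (0.12, 0.04)]
    offset, variance = fuse_offsets(readings)
    print(f"fused offset: {offset:.3f} m, "
          f"steering: {steering_command(offset):.1f} deg")
```

Note that the fused estimate is pulled toward the lowest-variance sensor (the lidar in this example), which mirrors the abstract's point that redundant, complementary sensing yields a more consistent basis for navigation than any single modality.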
Pages: 11712-11727
Page count: 16