From one field to another – Unsupervised domain adaptation for semantic segmentation in agricultural robotics

Cited by: 16
Authors
Magistri, Federico [1]
Weyler, Jan [1]
Gogoll, Dario [1]
Lottes, Philipp [1]
Behley, Jens [1]
Petrinic, Nik [2]
Stachniss, Cyrill [1,3,4]
Affiliations
[1] Univ Bonn, Photogrammetry & Robot Lab, Nussallee 15, D-53115 Bonn, Germany
[2] Univ Oxford, Impact Engn Lab, Oxford OX1 2JD, England
[3] Lamarr Inst Machine Learning & Artificial Intelligence, St Augustin, Germany
[4] Univ Oxford, Dept Engn Sci, Oxford OX1 3PJ, England
Keywords
Domain adaptation; Semantic segmentation; Deep learning; Generative adversarial networks; Image analysis; Networks; Crop
DOI
10.1016/j.compag.2023.108114
Chinese Library Classification
S [Agricultural Sciences]
Subject Classification Code
09
Abstract
In traditional arable crop fields, tractors treat the whole field uniformly, applying large quantities of herbicides and pesticides for weed control and plant protection. Autonomous robots, in contrast, offer the potential to provide per-plant treatment, making weed control and plant protection environmentally friendly. To this end, an autonomous robot has to reliably distinguish crops, weeds, and soil under a diverse range of environmental conditions using its onboard sensors. This recognition ability forms the basis for targeted, plant-specific treatments in the form of spot applications. Virtually all such perception systems used today rely on some form of machine learning. However, current learning-based solutions often suffer a performance drop when applied under new field conditions, which is a major bottleneck for real-world deployment and, ultimately, commercial adoption. In this paper, we propose a simple yet effective approach to unsupervised domain adaptation for semantic segmentation systems, so that an existing segmentation pipeline can be adapted to different fields, different robots, and different crops. Our system yields high segmentation performance in new target fields without the need for extra manual annotations; it exploits only annotations from the source domain, i.e., the original field used for training the robot's vision system. Our thorough evaluation shows that our approach achieves high accuracy when transferring an existing segmentation system to different environmental conditions, different plant species, and different robotic systems.
Pages: 10
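
The abstract describes the approach only at a high level: adapt an existing segmentation pipeline to a new target field while exploiting source-domain annotations alone, and the keywords point to generative adversarial networks as the adaptation mechanism. The following Python/PyTorch snippet is a minimal sketch of that general GAN-based recipe, not the authors' implementation: a pretrained CycleGAN-style generator translates labeled source-field images into the target field's appearance, and the segmentation network is trained on the translated images with the original source labels. All names here (TinySegNet, adapt_step, generator_s2t) are hypothetical placeholders.

import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 3  # crop, weed, soil

class TinySegNet(nn.Module):
    """Stand-in segmentation net; a real pipeline would use a stronger encoder-decoder."""
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, num_classes, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # per-pixel class logits, shape (N, C, H, W)

def adapt_step(seg_net, generator_s2t, optimizer, src_img, src_lbl):
    """One training step of the source-to-target recipe:
    1. Translate a labeled source-field image into the target field's appearance
       (generator_s2t is assumed pretrained, e.g. CycleGAN-style, and kept frozen).
    2. Supervise the segmentation net on the translated image with the ORIGINAL
       source labels, so no target-domain annotations are needed."""
    with torch.no_grad():
        fake_target = generator_s2t(src_img)
    logits = seg_net(fake_target)
    loss = F.cross_entropy(logits, src_lbl)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    seg_net = TinySegNet()
    generator_s2t = nn.Identity()  # placeholder for a trained source-to-target translator
    optimizer = torch.optim.Adam(seg_net.parameters(), lr=1e-4)
    src_img = torch.rand(2, 3, 64, 64)                    # dummy source batch
    src_lbl = torch.randint(0, NUM_CLASSES, (2, 64, 64))  # dummy dense labels
    print("loss:", adapt_step(seg_net, generator_s2t, optimizer, src_img, src_lbl))

Published GAN-based adaptation pipelines in this literature typically train the translator jointly with the segmentation network and add cycle-consistency and semantic-consistency losses; the frozen-generator, two-stage variant above is only the simplest form of the idea.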