DGBench: An Open-Source, Reproducible Benchmark for Dynamic Grasping
Cited by: 4
Authors:
Burgess-Limerick, Ben [1]
Lehnert, Chris [1]
Leitner, Jurgen [2]
Corke, Peter [1]
Affiliations:
[1] Queensland Univ Technol, Ctr Robot QCR, Brisbane, Qld, Australia
[2] LYRO Robot, Brisbane, Qld, Australia
Keywords:
Performance Evaluation and Benchmarking;
Perception for Grasping and Manipulation;
Grasping;
OBJECT;
VISION;
DOI:
10.1109/IROS47612.2022.9981670
CLC number:
TP [automation and computer technology];
Discipline code:
0812;
Abstract:
This paper introduces DGBench, a fully reproducible open-source testing system to enable benchmarking of dynamic grasping in environments with unpredictable relative motion between robot and object. We use the proposed benchmark to compare several visual perception arrangements. Traditional perception systems developed for static grasping are unable to provide feedback during the final phase of a grasp due to sensor minimum range, occlusion, and a limited field of view. A multi-camera eye-in-hand perception system is presented that has advantages over commonly used camera configurations. We quantitatively evaluate the performance on a real robot with an image-based visual servoing grasp controller and show a significantly improved success rate on a dynamic grasping task.
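Note: the abstract mentions an image-based visual servoing (IBVS) grasp controller. The sketch below is not the authors' implementation; it is a minimal illustration of the standard point-feature IBVS control law (camera velocity v = -lambda * L+ * (s - s*)), with the function names, gain, assumed depths, and four-point feature set chosen purely for illustration.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix (image Jacobian) for one normalized point feature at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Camera-frame velocity command v = -gain * pinv(L) @ (s - s*).

    features, desired: (N, 2) normalized image coordinates of current and goal features.
    depths: (N,) estimated feature depths in metres.
    Returns a 6-vector [vx, vy, vz, wx, wy, wz].
    """
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (features - desired).reshape(-1)
    return -gain * np.linalg.pinv(L) @ error

# Example: four point features slightly offset from their goal positions,
# with an assumed 0.30 m depth to the object.
s = np.array([[0.11, 0.10], [-0.10, 0.09], [-0.09, -0.11], [0.10, -0.10]])
s_star = np.array([[0.10, 0.10], [-0.10, 0.10], [-0.10, -0.10], [0.10, -0.10]])
Z = np.full(4, 0.30)
print(ibvs_velocity(s, s_star, Z))
```

The printed velocity drives the camera (here, an eye-in-hand camera) so that the observed features move toward their desired image positions; how the paper's controller selects features, estimates depth, and handles the multi-camera arrangement is described in the publication itself.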
Pages: 3218-3224
Page count: 7