EGNet: Efficient Robotic Grasp Detection Network

Cited by: 14
Authors
Yu, Sheng [1 ]
Zhai, Di-Hua [1 ]
Xia, Yuanqing [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Automat, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Grasping detection; manipulation relationship detection; object detection; robot; LOCATION;
DOI
10.1109/TIE.2022.3174274
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In this article, a novel grasp detection network, the efficient grasp detection network (EGNet), is proposed to address the challenge of grasping in stacked scenes; it completes the tasks of object detection, grasp detection, and manipulation relationship detection. For object detection, EGNet adopts the design of EfficientDet, with some hyperparameters modified to help the robot detect and classify objects. For grasp detection, a novel grasp detection module is proposed, which takes the feature map from the bidirectional feature pyramid network (BiFPN) as input and outputs grasp positions together with their quality scores. For manipulation relationship analysis, the network takes the feature maps from the BiFPN, the object detection branch, and the grasp detection branch, and outputs the best grasp position and the appropriate manipulation relationship. EGNet is trained and tested on the visual manipulation relationship dataset and the Cornell dataset, achieving detection accuracies of 87.1% and 98.9%, respectively. Finally, EGNet is also tested in a practical setting through grasp experiments on a Baxter robot. The experiments are performed in cluttered and stacked scenes, achieving success rates of 93.6% and 69.6%, respectively.
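To make the described grasp detection branch concrete, the following is a minimal illustrative sketch, not the paper's exact implementation: it assumes a hypothetical GraspHead module that consumes a BiFPN feature map and predicts, for each spatial cell, a grasp rectangle (center offsets, width, height, angle) plus a grasp quality score, as the abstract describes. Channel counts and layer choices are assumptions for illustration only.

# Illustrative sketch (assumed head design, not the authors' code): a grasp
# detection head on top of a BiFPN feature map, predicting 5 grasp parameters
# and 1 quality score per spatial cell.
import torch
import torch.nn as nn

class GraspHead(nn.Module):
    def __init__(self, in_channels: int = 64):
        super().__init__()
        # Small convolutional trunk shared by both output branches.
        self.trunk = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, in_channels, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # 5 grasp parameters per cell: center offsets, width, height, angle.
        self.grasp_reg = nn.Conv2d(in_channels, 5, 1)
        # 1 grasp quality score per cell, squashed to [0, 1].
        self.quality = nn.Conv2d(in_channels, 1, 1)

    def forward(self, bifpn_feat: torch.Tensor):
        x = self.trunk(bifpn_feat)
        return self.grasp_reg(x), torch.sigmoid(self.quality(x))

# Usage: one BiFPN level, e.g. an 80x80 feature map with 64 channels.
feat = torch.randn(1, 64, 80, 80)
grasps, scores = GraspHead()(feat)
print(grasps.shape, scores.shape)  # (1, 5, 80, 80) and (1, 1, 80, 80)

At inference, the cell with the highest quality score would give the best grasp position, which the abstract indicates is then combined with the object detection and manipulation relationship branches to decide which object to grasp first.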
Pages: 4058-4067
Number of pages: 10
相关论文
共 50 条
  • [21] A novel vision-based multi-task robotic grasp detection method for multi-object scenes
    Yanan Song
    Liang Gao
    Xinyu Li
    Weiming Shen
    Kunkun Peng
    Science China Information Sciences, 2022, 65
  • [22] A novel vision-based multi-task robotic grasp detection method for multi-object scenes
    Song, Yanan
    Gao, Liang
    Li, Xinyu
    Shen, Weiming
    Peng, Kunkun
    SCIENCE CHINA-INFORMATION SCIENCES, 2022, 65 (12)
  • [23] ThinNet: An Efficient Convolutional Neural Network for Object Detection
    Cao, Sen
    Liu, Yazhou
    Zhou, Changxin
    Sun, Quansen
    Pongsak, Lasang
    Shen, Sheng Mei
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 836 - 841
  • [24] Efficient Selective Context Network for Accurate Object Detection
    Nie, Jing
    Pang, Yanwei
    Zhao, Shengjie
    Han, Jungong
    Li, Xuelong
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2021, 31 (09) : 3456 - 3468
  • [25] Efficient and Lightweight Neural Network for Hard Hat Detection
    He, Chenxi
    Tan, Shengbo
    Zhao, Jing
    Ergu, Daji
    Liu, Fangyao
    Ma, Bo
    Li, Jianjun
    ELECTRONICS, 2024, 13 (13)
  • [26] A new lightweight network for efficient UAV object detection
    Hua, Wei
    Chen, Qili
    Chen, Wenbai
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [27] Efficient sensory-grounded grasp pose quality mapping for gripper design and online grasp planning
    Eizicovits, Danny
    Berman, Sigal
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2014, 62 (08) : 1208 - 1219
  • [28] An Efficient Robotic Pushing and Grasping Method in Cluttered Scene
    Yu, Sheng
    Zhai, Di-Hua
    Xia, Yuanqing
    Guan, Yuyin
    IEEE TRANSACTIONS ON CYBERNETICS, 2024, 54 (09) : 4889 - 4902
  • [29] AAGDN: Attention-Augmented Grasp Detection Network Based on Coordinate Attention and Effective Feature Fusion Method
    Zhou, Zhenning
    Zhu, Xiaoxiao
    Cao, Qixin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (06) : 3462 - 3469
  • [30] Biologically Inspired Grasp Primitives for a Dexterous Robotic Hand to Catch and Lift a Sphere
    Lavery, John
    Kent, Ben
    Engeberg, Erik D.
    2012 12TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS), 2012, : 1710 - 1715