Grasping Pose Estimation for Robots Based on Convolutional Neural Networks

Citations: 3
Authors
Zheng, Tianjiao [1 ,2 ]
Wang, Chengzhi [1 ]
Wan, Yanduo [1 ]
Zhao, Sikai [1 ]
Zhao, Jie [1 ]
Shan, Debin [2 ]
Zhu, Yanhe [1 ]
Affiliations
[1] Harbin Inst Technol, State Key Lab Robot & Syst, Harbin 150001, Peoples R China
[2] Harbin Inst Technol, Sch Mat Sci & Engn, Harbin 150001, Peoples R China
Funding
China Postdoctoral Science Foundation; National Key Research and Development Program of China;
Keywords
robot grasping; pose estimation; convolutional neural network; deep learning; MODEL;
DOI
10.3390/machines11100974
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
By learning manipulation in typical scenes, robots are gradually acquiring the ability to plan grasping actions in unknown scenes. As an end-to-end approach, grasping pose estimation has developed rapidly in recent years because of its good generalization. In this paper, we present a grasping pose estimation method for robots based on convolutional neural networks. The method employs a convolutional neural network model that outputs the grasping success rate, approach angle, and gripper opening width for each input voxel. A grasping dataset was produced and the model was trained in a physics simulator. To improve the grasping success rate, a position optimization for robotic grasping was proposed based on the distribution of the object centroid. An experimental platform for robot grasping was established, and 11 common everyday objects were selected for the experiments. Grasping experiments were performed on the 11 objects individually, on multiple objects in clutter, and in a dark environment without illumination. The results show that the method can adapt to grasping objects of different geometries, including irregular shapes, and is not influenced by lighting conditions. The total grasping success rate was 88.2% for individual objects and 81.1% for the cluttered scene.
Pages: 16
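The abstract describes a fully convolutional network that maps an input voxel grid to per-voxel grasp outputs: a grasping success score, an approach orientation, and a gripper opening width. The following is a minimal PyTorch sketch of that kind of architecture, not the authors' released code; the channel counts, kernel sizes, and the quaternion parameterization of the approach orientation are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's implementation) of a
# fully convolutional 3D network that maps a voxelized scene (e.g., a 40^3
# TSDF) to per-voxel grasp predictions: success score, orientation, and
# gripper opening width.
import torch
import torch.nn as nn


class GraspPoseCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared 3D convolutional encoder (layer sizes are assumptions).
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Per-voxel output heads: success probability, approach orientation
        # (here a unit quaternion, one plausible encoding), and opening width.
        self.quality_head = nn.Conv3d(32, 1, kernel_size=1)
        self.rotation_head = nn.Conv3d(32, 4, kernel_size=1)
        self.width_head = nn.Conv3d(32, 1, kernel_size=1)

    def forward(self, tsdf):
        # tsdf: (B, 1, D, H, W) voxelized scene.
        feat = self.encoder(tsdf)
        quality = torch.sigmoid(self.quality_head(feat))   # (B, 1, D, H, W)
        rotation = nn.functional.normalize(                 # unit quaternions
            self.rotation_head(feat), dim=1)                # (B, 4, D, H, W)
        width = self.width_head(feat)                       # (B, 1, D, H, W)
        return quality, rotation, width


# Usage example on a dummy 40^3 grid:
if __name__ == "__main__":
    net = GraspPoseCNN()
    q, r, w = net(torch.zeros(1, 1, 40, 40, 40))
    print(q.shape, r.shape, w.shape)
```

At inference time, one would typically pick the voxel with the highest predicted success score and read off the orientation and width at that location to form the grasp pose.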
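The abstract also mentions a grasp position optimization based on the distribution of the object centroid. The exact rule is not reproduced in the abstract; the sketch below is one plausible reading, in which a candidate grasp point is shifted toward the object centroid (computed from occupied voxels) by a tunable fraction. The function name, the `alpha` hyperparameter, and the blending rule are all assumptions for illustration.

```python
# Hedged sketch of a centroid-based grasp position refinement (assumed
# interpretation, not the paper's exact procedure).
import numpy as np


def refine_grasp_position(grasp_pos, occupancy, voxel_size, alpha=0.5):
    """Shift grasp_pos (xyz, meters) toward the object centroid.

    occupancy: (D, H, W) boolean voxel grid of the target object.
    voxel_size: edge length of one voxel in meters.
    alpha: assumed blending factor; 0 = no change, 1 = snap to centroid.
    """
    idx = np.argwhere(occupancy)              # indices of occupied voxels
    if idx.size == 0:
        # No object voxels found: leave the candidate position unchanged.
        return np.asarray(grasp_pos, dtype=float)
    centroid = idx.mean(axis=0) * voxel_size  # centroid in metric coordinates
    return (1.0 - alpha) * np.asarray(grasp_pos, dtype=float) + alpha * centroid
```

Pulling the grasp point toward the centroid tends to reduce the moment arm between the contact and the object's center of mass, which is consistent with the abstract's claim that the optimization improves the grasping success rate.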