Instance segmentation based 6D pose estimation of industrial objects using point clouds for robotic bin-picking

Cited by: 28
Authors
Zhuang, Chungang [1]
Li, Shaofei [1]
Ding, Han [1]
Affiliations
[1] Shanghai Jiao Tong University, School of Mechanical Engineering, Shanghai 200240, China
Funding
National Natural Science Foundation of China
Keywords
Point cloud; Deep learning; Instance segmentation; Pose estimation; Robotic bin picking; Recognition
DOI
10.1016/j.rcim.2023.102541
Chinese Library Classification
TP39 [Computer applications]
Discipline Codes
081203; 0835
Abstract
3D object pose estimation for robotic grasping and manipulation is a crucial task in the manufacturing industry. In cluttered and occluded scenes, 6D pose estimation of low-textured or textureless industrial objects is challenging because color information is of little use. Point clouds, which are largely insensitive to lighting conditions, are therefore gaining popularity as an alternative input for pose estimation. This article proposes a deep learning-based pose estimation method that takes point clouds as input and consists of two stages: instance segmentation and instance point cloud pose estimation. The instance segmentation stage divides the scene point cloud into multiple instance point clouds, and the pose of each instance is then accurately predicted by fusing depth and normal feature maps. To reduce the time spent on dataset acquisition and annotation, a physics-based simulation engine is constructed to generate a synthetic dataset. Finally, experiments are conducted on public, synthetic, and real datasets to verify the effectiveness of the pose estimation network. The results show that the point cloud-based network can effectively and robustly predict object poses in cluttered and occluded scenes.
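The abstract describes a two-stage pipeline: first segment the scene point cloud into per-object instance clouds, then predict a 6D pose for each instance. As a rough illustration of that data flow only (not the authors' network), the Python sketch below substitutes classical stand-ins: radius-based Euclidean clustering in place of the learned instance segmentation, and a PCA fit in place of the depth/normal feature-map fusion network. All function names, thresholds, and the toy scene are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points, radius=0.01, min_size=100):
    """Radius-based connected components: a classical stand-in for the
    learned instance segmentation stage (hypothetical parameters)."""
    tree = cKDTree(points)
    labels = -np.ones(len(points), dtype=int)    # -1 = noise / unassigned
    visited = np.zeros(len(points), dtype=bool)
    current = 0
    for seed in range(len(points)):
        if visited[seed]:
            continue
        frontier, members = [seed], [seed]       # flood-fill one component
        visited[seed] = True
        while frontier:
            idx = frontier.pop()
            for nb in tree.query_ball_point(points[idx], radius):
                if not visited[nb]:
                    visited[nb] = True
                    frontier.append(nb)
                    members.append(nb)
        if len(members) >= min_size:             # keep only sizeable instances
            labels[members] = current
            current += 1
    return labels

def coarse_pose(instance_points):
    """Coarse 6D pose from PCA: centroid gives translation, principal axes
    give rotation (up to axis sign). The paper's network would replace this."""
    t = instance_points.mean(axis=0)
    _, _, vt = np.linalg.svd(instance_points - t, full_matrices=False)
    R = vt.T
    if np.linalg.det(R) < 0:                     # enforce a proper rotation
        R[:, -1] *= -1
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T                                     # 4x4 homogeneous pose

# Toy scene: two well-separated random blobs standing in for two parts in a bin.
rng = np.random.default_rng(0)
scene = np.vstack([rng.uniform(0.0, 0.05, (500, 3)),
                   rng.uniform(0.0, 0.05, (500, 3)) + 0.2])
labels = euclidean_cluster(scene, radius=0.02, min_size=50)
for k in range(labels.max() + 1):
    print(f"instance {k} pose:\n{coarse_pose(scene[labels == k])}")
```

In the paper's method, the PCA step is replaced by the learned regression over fused depth and normal feature maps; the sketch only shows where each instance's predicted pose enters the bin-picking pipeline.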
Pages: 18
Related Papers
50 records in total
  • [21] Segmentation based 6D pose estimation using integrated shape pattern and RGB information
    Gu, Chaochen
    Feng, Qi
    Lu, Changsheng
    Zhao, Shuxin
    Xu, Rui
    Pattern Analysis and Applications, 2022, 25(4): 1055-1073
  • [22] 6D Object Pose Estimation using Few-Shot Instance Segmentation and 3D Matching
    Li, Wanyi
    Sun, Jia
    Luo, Yongkang
    Wang, Peng
    2019 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2019), 2019: 1071-1077
  • [23] 6D Pose Estimation from Point Cloud Using an Improved Point Pair Features Method
    Wang, Haoyu
    Wang, Hesheng
    Zhuang, Chungang
    2021 7th International Conference on Control, Automation and Robotics (ICCAR), 2021: 280-284
  • [24] 6D Pose Estimation of Transparent Objects Using Synthetic Data
    Byambaa, Munkhtulga
    Koutaki, Gou
    Choimaa, Lodoiravsal
    Frontiers of Computer Vision (IW-FCV 2022), 2022, 1578: 3-17
  • [25] Point Pair Feature-Based Pose Estimation with Multiple Edge Appearance Models (PPF-MEAM) for Robotic Bin Picking
    Liu, Diyi
    Arai, Shogo
    Miao, Jiaqi
    Kinugawa, Jun
    Wang, Zhao
    Kosuge, Kazuhiro
    Sensors, 2018, 18(8)
  • [26] 3D object recognition and pose estimation for random bin-picking using Partition Viewpoint Feature Histograms
    Li, Deping
    Liu, Ning
    Guo, Yulan
    Wang, Xiaoming
    Xu, Jin
    Pattern Recognition Letters, 2019, 128: 148-154
  • [27] Fast and precise 6D pose estimation of textureless objects using the point cloud and gray image
    Pan, Wang
    Zhu, Feng
    Hao, Yingming
    Zhang, Limin
    Applied Optics, 2018, 57(28): 8154-8165
  • [28] Deep object 6-DoF pose estimation using instance segmentation
    Pujolle, Victor
    Hayashi, Eiji
    Proceedings of the 2020 International Conference on Artificial Life and Robotics (ICAROB2020), 2020: 241-244
  • [29] 6D Pose Estimation Using an Improved Method Based on Point Pair Features
    Vidal, Joel
    Lin, Chyi-Yeu
    Marti, Robert
    Conference Proceedings of 2018 4th International Conference on Control, Automation and Robotics (ICCAR), 2018: 405-409
  • [30] Visual Positioning and Picking Pose Estimation of Tomato Clusters Based on Instance Segmentation
    Zhang Q.
    Pang Y.
    Li B.
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(10): 205-215