Where Shall I Touch? Vision-Guided Tactile Poking for Transparent Object Grasping

Times Cited: 14
Authors
Jiang, Jiaqi [1 ,2 ]
Cao, Guanqun [1 ]
Butterworth, Aaron [1 ]
Thanh-Toan Do [3 ]
Luo, Shan [1 ,2 ]
Affiliations
[1] Univ Liverpool, Dept Comp Sci, smARTLab, Liverpool L69 3BX, Merseyside, England
[2] Kings Coll London, Dept Engn, London WC2R 2LS, England
[3] Monash Univ, Dept Data Sci & AI, Clayton, Vic 3800, Australia
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Robot sensing systems; Grasping; Robots; Sensors; Cameras; Glass; Robot kinematics; Multimodal sensing; object segmentation; robot grasping and manipulation; tactile sensing; transparent objects; visual perception; GENERATION; PERCEPTION;
DOI
10.1109/TMECH.2022.3201057
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Picking up transparent objects remains a challenging task for robots. The visual properties of transparent objects, such as reflection and refraction, cause current grasping methods that rely on camera sensing to fail to detect and localise them. However, humans handle transparent objects well by first observing their coarse profile and then poking an area of interest to obtain a fine profile for grasping. Inspired by this, we propose a novel framework of vision-guided tactile poking for transparent object grasping. In the proposed framework, a segmentation network is first used to predict horizontal upper regions, named poking regions, where the robot can poke the object to obtain a good tactile reading while causing minimal disturbance to the object's state. A poke is then performed with a high-resolution GelSight tactile sensor. Given the local profiles improved with the tactile reading, a heuristic grasp is planned for grasping the transparent object. To mitigate the limitations of real-world data collection and labelling for transparent objects, a large-scale realistic synthetic dataset was constructed. Extensive experiments demonstrate that our proposed segmentation network predicts potential poking regions with a high mean Average Precision (mAP) of 0.360, and that vision-guided tactile poking significantly improves the grasping success rate from 38.9% to 85.2%. Thanks to its simplicity, our proposed approach could also be adopted with other force or tactile sensors and could be used for grasping other challenging objects. All the materials used in this paper are available at https://sites.google.com/view/tactilepoking.
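The abstract describes a three-stage pipeline: a segmentation network proposes a poking region, a tactile poke refines the coarse visual estimate, and a heuristic grasp is planned on the refined contact point. The sketch below is purely illustrative; all function names, the score-map stand-in for the segmentation network, and the simulated sensor offset are assumptions of this sketch, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class Grasp:
    """A planned grasp centre in image/table coordinates (illustrative)."""
    x: float
    y: float

def predict_poking_region(score_map):
    """Stand-in for the segmentation network: return the (row, col) of the
    highest-scoring cell in a per-pixel poking-region score map."""
    rows, cols = len(score_map), len(score_map[0])
    return max(
        ((r, c) for r in range(rows) for c in range(cols)),
        key=lambda rc: score_map[rc[0]][rc[1]],
    )

def tactile_poke(coarse_xy, contact_offset):
    """Simulated GelSight poke: the tactile contact reading corrects the
    coarse visual estimate by the measured offset."""
    return (coarse_xy[0] + contact_offset[0], coarse_xy[1] + contact_offset[1])

def plan_grasp(refined_xy):
    """Heuristic grasp centred on the tactilely refined contact point."""
    return Grasp(*refined_xy)

# Usage: coarse vision proposal, tactile refinement, then grasp planning.
score_map = [
    [0.1, 0.2, 0.1],
    [0.3, 0.9, 0.2],
    [0.1, 0.4, 0.1],
]
coarse = predict_poking_region(score_map)
refined = tactile_poke(coarse, (0.2, -0.1))
grasp = plan_grasp(refined)
print(grasp)
```

The key design point the paper's numbers support is that the tactile refinement step, not vision alone, is what lifts the grasping success rate (from 38.9% to 85.2%).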
Pages: 233-244
Number of pages: 12
Related Papers
27 records in total
  • [1] Vision-guided robotic grasping: Issues and experiments
    Smith, CE
    Papanikolopoulos, NP
    1996 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, PROCEEDINGS, VOLS 1-4, 1996, : 3203 - 3208
  • [2] Vision-guided grasping of unknown objects for service robots
    Sanz, PJ
    del Pobil, AP
    Inesta, JM
    Recatala, G
    1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, 1998, : 3018 - 3025
  • [3] Autonomous vision-guided bi-manual grasping and manipulation
    Rastegarpanah, Alireza
    Marturi, Naresh
    Stolkin, Rustam
    2017 IEEE WORKSHOP ON ADVANCED ROBOTICS AND ITS SOCIAL IMPACTS, 2017,
  • [4] Vision-Guided Active Tactile Perception for Crack Detection and Reconstruction
    Jiang, Jiaqi
    Cao, Guanqun
    Gomes, Daniel Fernandes
    Luo, Shan
    2021 29TH MEDITERRANEAN CONFERENCE ON CONTROL AND AUTOMATION (MED), 2021, : 930 - 936
  • [6] Vision facilitates tactile perception when grasping an object
    Juravle, Georgiana
    Colino, Francisco L.
    Meleqi, Xhino
    Binsted, Gordon
    Farne, Alessandro
    SCIENTIFIC REPORTS, 2018, 8
  • [7] Issues and experimental results in vision-guided robotic grasping of static or moving objects
    Papanikolopoulos, N
    Smith, CE
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 1998, 25 (02): : 134+
  • [8] An automatic machine vision-guided grasping system for Phalaenopsis tissue culture plantlets
    Huang, Ying-Jen
    Lee, Fang-Fan
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2010, 70 (01) : 42 - 51
  • [9] A vision-guided object tracking and prediction algorithm for soccer robots
    Hong, CS
    Chun, SM
    Lee, JS
    Hong, KS
    1997 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION - PROCEEDINGS, VOLS 1-4, 1997, : 346 - 351
  • [10] Visual-Tactile Fusion for Transparent Object Grasping in Complex Backgrounds
    Li, Shoujie
    Yu, Haixin
    Ding, Wenbo
    Liu, Houde
    Ye, Linqi
    Xia, Chongkun
    Wang, Xueqian
    Zhang, Xiao-Ping
    IEEE TRANSACTIONS ON ROBOTICS, 2023, 39 (05) : 3838 - 3856