Tactile Transfer Learning and Object Recognition With a Multifingered Hand Using Morphology Specific Convolutional Neural Networks

Cited: 11
Authors
Funabashi, Satoshi [1 ]
Yan, Gang [2 ]
Fei, Hongyi [2 ]
Schmitz, Alexander [2 ]
Jamone, Lorenzo [3 ]
Ogata, Tetsuya [4 ]
Sugano, Shigeki [2 ]
Affiliations
[1] Waseda Univ, Inst AI Robot, Future Robot Org, Tokyo 1698555, Japan
[2] Waseda Univ, Dept Modern Mech Engn, Tokyo 1698555, Japan
[3] Queen Mary Univ London, Sch Elect Engn & Comp Sci, London E1 4NS, England
[4] Waseda Univ, Dept Intermedia Art & Sci, Tokyo 1698555, Japan
Funding
Japan Science and Technology Agency (JST);
Keywords
Robot sensing systems; Tactile sensors; Task analysis; Object recognition; Transfer learning; Shape; Convolutional neural networks; Convolutional neural network (CNN); multifingered hand; object recognition; tactile sensing; SENSORS; COST; SKIN;
DOI
10.1109/TNNLS.2022.3215723
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multifingered robot hands can be extremely effective in physically exploring and recognizing objects, especially if they are extensively covered with distributed tactile sensors. Convolutional neural networks (CNNs) have been proven successful in processing high-dimensional data, such as camera images, and are, therefore, very well suited to analyze distributed tactile information as well. However, a major challenge is to organize tactile inputs coming from different locations on the hand in a coherent structure that can leverage the computational properties of the CNN. Therefore, we introduce a morphology-specific CNN (MS-CNN), in which hierarchical convolutional layers are formed following the physical configuration of the tactile sensors on the robot. We equipped a four-fingered Allegro robot hand with several uSkin tactile sensors; overall, the hand is covered with 240 sensitive elements, each one measuring three-axis contact force. The MS-CNN layers process the tactile data hierarchically: at the level of small local clusters first, then each finger, and then the entire hand. We show experimentally that, after training, the robot hand can successfully recognize objects by a single touch, with a recognition rate of over 95%. Interestingly, the learned MS-CNN representation transfers well to novel tasks: by adding a limited amount of data about new objects, the network can recognize nine types of physical properties.
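The abstract outlines a three-level hierarchy: shared convolutions over small local taxel clusters, per-finger aggregation, and hand-level fusion. The following is a minimal PyTorch sketch of that idea only; the 4x4 patch layout, the number of patches per finger, the layer widths, and the class count are illustrative assumptions and do not reproduce the published MS-CNN architecture or the exact 240-taxel arrangement.

```python
# Minimal sketch of a morphology-specific hierarchical CNN for distributed tactile data.
# Assumptions (hypothetical, for illustration): taxels are grouped into 4x4 patches of
# 3-axis forces, with a fixed number of patches per finger; sizes are not the authors'.
import torch
import torch.nn as nn


class MSCNNSketch(nn.Module):
    def __init__(self, patches_per_finger=4, num_fingers=4, num_classes=20):
        super().__init__()
        # Level 1: shared convolution applied to each local 4x4 patch of 3-axis forces.
        self.patch_cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # one 32-dim feature vector per patch
        )
        # Level 2: fuse the patch features belonging to one finger.
        self.finger_fc = nn.Sequential(
            nn.Linear(32 * patches_per_finger, 64),
            nn.ReLU(),
        )
        # Level 3: fuse all finger features for the whole hand, then classify.
        self.hand_fc = nn.Sequential(
            nn.Linear(64 * num_fingers, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        # x: (batch, fingers, patches, 3, 4, 4) -- 3-axis force on each taxel patch
        b, f, p = x.shape[:3]
        patch_feats = self.patch_cnn(x.view(b * f * p, 3, 4, 4)).view(b, f, p * 32)
        finger_feats = self.finger_fc(patch_feats)          # (batch, fingers, 64)
        return self.hand_fc(finger_feats.view(b, -1))       # (batch, num_classes)


if __name__ == "__main__":
    model = MSCNNSketch()
    dummy = torch.randn(2, 4, 4, 3, 4, 4)  # 2 simulated single-touch tactile readings
    print(model(dummy).shape)              # torch.Size([2, 20])
```

For the transfer-learning result mentioned in the abstract, one plausible reading (an assumption about the procedure, not the authors' exact method) is to keep the learned lower levels (patch_cnn, finger_fc) and retrain only a new hand-level head on the limited data for the new property-recognition tasks.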
Pages: 7587-7601
Number of pages: 15