Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors

Cited by: 128
Authors
Spiers, Adam J. [1]
Liarokapis, Minas V. [1]
Calli, Berk [1]
Dollar, Aaron M. [1]
Affiliation
[1] Yale Univ, Dept Mech Engn & Mat Sci, GRAB Lab, New Haven, CT 06511 USA
Funding
U.S. National Science Foundation;
Keywords
Tactile sensing; object classification; object feature extraction; underactuated robot hands; machine learning; adaptive grasping; robotics; haptics applications; HAPTIC IDENTIFICATION; EXPLORATION;
DOI
10.1109/TOH.2016.2521378
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data is limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp-release, or force modulation, and it works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
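As an illustration of the classification scheme the abstract describes (not the authors' code), the sketch below trains a random forest on single-grasp feature vectors built from the sensing the paper specifies: one actuator position per two-link finger plus eight barometric force readings per finger. The two-fingered hand layout, the 18-dimensional feature vector, and the synthetic grasp data are all assumptions for the sake of the example.

```python
# Hypothetical sketch of single-grasp object classification with a
# random forest, assuming a two-finger hand: 2 actuator positions +
# 2 x 8 barometric pressure readings = 18 features per grasp.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
N_FEATURES = 2 + 2 * 8   # actuator positions + force sensor values

def synthetic_grasps(n_grasps):
    """Stand-in for real grasp data: each object class produces feature
    vectors clustered around a class-specific signature."""
    center = rng.uniform(0.0, 1.0, size=N_FEATURES)
    return center + 0.05 * rng.standard_normal((n_grasps, N_FEATURES))

# Four object classes, 30 grasps each.
X = np.vstack([synthetic_grasps(30) for _ in range(4)])
y = np.repeat(np.arange(4), 30)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)          # accuracy on the synthetic training set
print(f"accuracy: {acc:.2f}")
```

In practice the features would come from a single unplanned grasp rather than synthetic clusters, and the trained forest would be evaluated on held-out grasps of novel object poses.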
Pages: 207-220 (14 pages)