On Exploiting Haptic Cues for Self-Supervised Learning of Depth-Based Robot Navigation Affordances

Cited by: 0
Authors
José Baleia
Pedro Santana
José Barata
Affiliations
[1] Instituto de Telecomunicações, ISCTE - Instituto Universitário de Lisboa, Portugal
[2] CTS, Universidade Nova de Lisboa, Portugal
Source
Journal of Intelligent & Robotic Systems | 2015, Vol. 80
Keywords
Autonomous robots; Self-supervised learning; Affordances; Terrain assessment; Depth sensing; Tactile sensing;
DOI
Not available
Abstract
This article presents a method for online learning of robot navigation affordances from spatiotemporally correlated haptic and depth cues. The method allows the robot to incrementally learn which objects present in the environment are actually traversable. This is a critical requirement for any wheeled robot operating in natural environments, where the inability to discern vegetation from non-traversable obstacles frequently hampers terrain progression. A wheeled robot prototype was developed in order to experimentally validate the proposed method. The prototype obtains haptic and depth sensory feedback from a pan-tilt telescopic antenna and from a structured light sensor, respectively. With the presented method, the robot learns a mapping between objects’ descriptors, given the range data provided by the sensor, and objects’ stiffness, as estimated from the interaction between the antenna and the object. Learning confidence estimation is used to progressively reduce the number of physical interactions required with previously encountered objects. To raise the number of meaningful interactions per object under time pressure, the segments of the object under analysis are prioritised according to a set of morphological criteria. Field trials show the ability of the robot to progressively learn which elements of the environment are traversable.
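The core loop the abstract describes can be sketched in a few lines: the robot maintains a memory of (depth descriptor, stiffness) pairs, predicts stiffness for a newly perceived object from its most similar stored descriptor, and triggers a physical probe with the antenna only when prediction confidence is low. The sketch below is an illustration under assumed choices (nearest-neighbour lookup, a distance-based confidence radius, and the names `AffordanceMemory`, `observe`, `probe_fn`), not the authors' implementation:

```python
import math

class AffordanceMemory:
    """Toy version of self-supervised affordance learning: haptic probes
    label depth descriptors; confident predictions skip further probing."""

    def __init__(self, confidence_radius=0.5):
        self.samples = []                 # stored (descriptor, stiffness) pairs
        self.radius = confidence_radius   # max distance for a confident match

    def _distance(self, a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def predict(self, descriptor):
        """Return (stiffness, confident) from the nearest stored sample."""
        if not self.samples:
            return None, False
        desc, stiff = min(self.samples,
                          key=lambda s: self._distance(s[0], descriptor))
        return stiff, self._distance(desc, descriptor) <= self.radius

    def observe(self, descriptor, probe_fn):
        """Predict stiffness; probe physically only when uncertain.

        probe_fn stands in for the haptic interaction between the
        telescopic antenna and the object, returning a stiffness estimate.
        """
        stiffness, confident = self.predict(descriptor)
        if confident:
            return stiffness, False       # no interaction needed
        stiffness = probe_fn()            # self-supervision: haptic label
        self.samples.append((descriptor, stiffness))
        return stiffness, True
```

In use, the first encounter with a soft, grass-like object requires a probe; a later object with a similar depth descriptor is classified without contact, which is the mechanism by which interactions with familiar objects taper off.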
Pages: 455–474 (19 pages)