Keyframe-based Learning from Demonstration Method and Evaluation

Cited by: 135
Authors
Akgun, Baris [1 ]
Cakmak, Maya [1 ]
Jiang, Karl [1 ]
Thomaz, Andrea L. [1 ]
Affiliations
[1] Georgia Inst Technol, Sch Interact Comp, Atlanta, GA 30332 USA
Keywords
Learning from Demonstration; Kinesthetic teaching; Human-Robot Interaction; Humanoid robotics; Imitation; Task
DOI
10.1007/s12369-012-0160-0
Chinese Library Classification
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
We present a framework for learning skills from novel types of demonstrations that have been shown to be desirable from a Human-Robot Interaction perspective. Our approach, Keyframe-based Learning from Demonstration (KLfD), takes demonstrations that consist of keyframes: a sparse set of points in the state space that produces the intended skill when visited in sequence. Conventional trajectory demonstrations, or a hybrid of the two, are also handled by KLfD through conversion to keyframes. Our method produces a skill model consisting of an ordered set of keyframe clusters, which we call Sequential Pose Distributions (SPD). The skill is reproduced by splining between clusters. We present results from two domains: mouse gestures in 2D, and scooping, pouring, and placing skills on a humanoid robot. KLfD performs similarly to existing LfD techniques when applied to conventional trajectory demonstrations. Additionally, we demonstrate that KLfD may be preferable when the demonstration type is suited to the skill.
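The abstract describes the core of KLfD: keyframes from several demonstrations are grouped into an ordered set of clusters (the Sequential Pose Distributions), and the skill is reproduced by splining through those clusters in order. Below is a minimal sketch of that idea in Python, not the authors' implementation: it assumes keyframes are aligned by their normalized position within each demonstration, models each cluster as a Gaussian over poses, and splines through the cluster means with SciPy; the function names, the alignment rule, and the fixed number of clusters are illustrative assumptions only.

    # Hypothetical keyframe-cluster-and-spline sketch, loosely following the
    # KLfD description in the abstract (not the paper's implementation).
    import numpy as np
    from scipy.interpolate import CubicSpline

    def build_spd(demos, n_clusters):
        """Group keyframes from all demos by normalized sequence position and
        fit one Gaussian (mean, covariance) per ordered cluster."""
        clusters = [[] for _ in range(n_clusters)]
        for demo in demos:  # demo: (K, D) array of K keyframes in a D-dim pose space
            pos = np.linspace(0.0, 1.0, len(demo))
            idx = np.minimum((pos * n_clusters).astype(int), n_clusters - 1)
            for i, keyframe in zip(idx, demo):
                clusters[i].append(keyframe)
        return [(np.mean(c, axis=0), np.cov(np.array(c).T)) for c in clusters if c]

    def reproduce(spd, n_steps=100):
        """Reproduce the skill by splining through the ordered cluster means."""
        means = np.array([mean for mean, _ in spd])
        phase = np.linspace(0.0, 1.0, len(means))
        return CubicSpline(phase, means, axis=0)(np.linspace(0.0, 1.0, n_steps))

    # Usage: two noisy keyframe demonstrations of a 2D gesture.
    rng = np.random.default_rng(0)
    base = np.array([[0, 0], [1, 2], [2, 3], [3, 2], [4, 0]], dtype=float)
    demos = [base + 0.05 * rng.standard_normal(base.shape) for _ in range(2)]
    trajectory = reproduce(build_spd(demos, n_clusters=5))
    print(trajectory.shape)  # (100, 2)

The paper's actual keyframe extraction from trajectory or hybrid demonstrations and its clustering procedure are more involved; this sketch only illustrates the ordered-cluster-then-spline structure of the skill model.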
Pages: 343-355
Page count: 13
Related Papers
50 records in total
  • [31] Toward Generalization of Bipedal Gait Cycle During Stair Climbing Using Learning From Demonstration
    Goldfarb, Nathaniel
    Bales, Charles
    Fischer, Gregory S.
    IEEE TRANSACTIONS ON MEDICAL ROBOTICS AND BIONICS, 2021, 3(2): 446-454
  • [32] Confidence-Based Multi-Robot Learning from Demonstration
    Chernova, Sonia
    Veloso, Manuela
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2010, 2: 195-215
  • [33] Robot Grasp Planning: A Learning from Demonstration-Based Approach
    Wang, Kaimeng
    Fan, Yongxiang
    Sakuma, Ichiro
    SENSORS, 2024, 24(2)
  • [34] Learning from Demonstration Based on a Classification of Task Parameters and Trajectory Optimization
    Vidakovic, Josip
    Jerbic, Bojan
    Sekoranja, Bojan
    Svaco, Marko
    Suligoj, Filip
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2020, 99(2): 261-275
  • [35] Motion Planning With Success Judgement Model Based on Learning From Demonstration
    Furuta, Daichi
    Kutsuzawa, Kyo
    Sakaino, Sho
    Tsuji, Toshiaki
    IEEE ACCESS, 2020, 8: 73142-73150
  • [37] Towards Goal Based Architecture Design for Learning High-Level Representation of Behaviors from Demonstration
    Fonooni, Benjamin
    Hellstrom, Thomas
    Janlert, Lars-Erik
    2013 IEEE INTERNATIONAL MULTI-DISCIPLINARY CONFERENCE ON COGNITIVE METHODS IN SITUATION AWARENESS AND DECISION SUPPORT (COGSIMA), 2013: 67-74
  • [38] A Robot Learning from Demonstration Platform Based on Optical Motion Capture
    Yan, Hengyuan
    Zhou, Haiping
    Hu, Haopeng
    Lou, Yunjiang
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT II, 2021, 13014: 100-110
  • [39] Vision-Based Learning from Demonstration System for Robot Arms
    Hwang, Pin-Jui
    Hsu, Chen-Chien
    Chou, Po-Yung
    Wang, Wei-Yen
    Lin, Cheng-Hung
    SENSORS, 2022, 22(7)
  • [40] A Practical Comparison of Three Robot Learning from Demonstration Algorithms
    Suay, Halit Bener
    Toris, Russell
    Chernova, Sonia
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2012, 4: 319-330