Learning quasi-periodic robot motions from demonstration

Cited by: 2
Authors
Li, Xiao [1 ]
Cheng, Hongtai [1 ]
Chen, Heping [2 ]
Chen, Jiaming [3 ]
Affiliations
[1] Northeastern Univ, Sch Mech Engn & Automat, Shenyang, Liaoning, Peoples R China
[2] Texas State Univ San Marcos, Ingram Sch Engn, San Marcos, TX 78666 USA
[3] Northeastern Univ, Sch Sino Dutch Biomed & Informat Engn, Shenyang, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Quasi-periodic motion; Learning from demonstration; GMM; GMR; EMPIRICAL MODE DECOMPOSITION; FREE-FORM SURFACES; FRAMEWORK;
DOI
10.1007/s10514-019-09891-y
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The goal of Learning from Demonstration is to automatically transfer skill knowledge from human to robot. Current research focuses on modeling aperiodic/periodic robot motions and extracting dynamic task parameters from recorded sensory information. However, these approaches remain inadequate for describing complex behaviors in unstructured environments, such as searching for an unknown fitting position or painting/polishing an irregular surface. The quasi-periodic and stochastic properties of such motions place high demands on the generalization ability of the modeling techniques. This paper proposes a systematic framework for learning quasi-periodic robot motions, which consists of three steps: decomposition, modeling, and synthesis. First, an FFT is performed to identify all the frequencies in the quasi-periodic motion. The motion is then decomposed into an offset component and a series of harmonic and corresponding envelope components, based on the concept of equivalent transformation. The offset component is extracted by Empirical Mode Decomposition, the harmonics are separated by notch filters, and the envelope components are extracted by the Hilbert transform. These components are either periodic or aperiodic. The aperiodic components can be modeled by conventional techniques such as the Gaussian Mixture Model and recovered by Gaussian Mixture Regression; the periodic components are modeled in closed-form expressions. Finally, the components are synthesized to regenerate the robot motion. This modeling process captures both the aperiodicity and periodicity of a quasi-periodic motion. Simulations and experiments show that the proposed methods are feasible and effective, and can predict robot motions beyond the demonstrations. This generalization ability reduces programming difficulty and demonstration complexity.
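The decomposition pipeline in the abstract (FFT to locate the dominant harmonic, offset removal, Hilbert transform for the envelope) can be illustrated on a synthetic signal. This is a minimal sketch, not code from the paper: a simple linear detrend stands in for the paper's Empirical Mode Decomposition step, the notch-filter harmonic separation is omitted, and the signal itself is a made-up amplitude-modulated sinusoid with a drifting offset.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic quasi-periodic "motion": drifting offset + amplitude-modulated harmonic
t = np.linspace(0, 10, 2000)
offset = 0.1 * t                                  # slow aperiodic offset component
envelope = 1.0 + 0.3 * np.sin(2 * np.pi * 0.2 * t)  # slowly varying envelope
signal = offset + envelope * np.sin(2 * np.pi * 2.0 * t)

# Step 1: FFT to identify the dominant harmonic frequency
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
f_dom = freqs[np.argmax(spectrum)]                # close to the 2 Hz carrier

# Step 2: remove the offset (linear detrend stands in for EMD here)
trend = np.polyval(np.polyfit(t, signal, 1), t)
detrended = signal - trend

# Step 3: Hilbert transform recovers the instantaneous amplitude envelope
env_est = np.abs(hilbert(detrended))
```

Away from the boundaries (where FFT-based Hilbert transforms show edge effects), `env_est` tracks the true envelope, and `f_dom` recovers the carrier frequency, mirroring the decomposition the framework performs before modeling each component separately.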
Pages: 251-266
Page count: 16
Related Papers
50 items total
  • [41] Vision-Based Learning from Demonstration System for Robot Arms
    Hwang, Pin-Jui
    Hsu, Chen-Chien
    Chou, Po-Yung
    Wang, Wei-Yen
    Lin, Cheng-Hung
    SENSORS, 2022, 22 (07)
  • [42] Human and Robot Perception in Large-scale Learning from Demonstration
    Crick, Christopher
    Osentoski, Sarah
    Jay, Graylin
    Jenkins, Odest Chadwicke
    PROCEEDINGS OF THE 6TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI 2011), 2011, : 339 - 346
  • [43] Robot Learning From Demonstration for Assembly With Sequential Assembly Movement Primitives
    Hu, Haopeng
    Yan, Hengyuan
    Yang, Xiansheng
    Lou, Yunjiang
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2024, 29 (04) : 2685 - 2696
  • [44] A Robot Learning from Demonstration Method Based on Neural Network and Teleoperation
    Liang, Ke
    Wang, Yupeng
    Pan, Lei
    Tang, Yu
    Li, Jing
    Lin, Yizhong
    Pan, Mingzhang
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2024, 49 : 1659 - 1672
  • [45] Quasi-Periodic Motion of a Strip During Rolling With 1:6 Internal Resonance
    Zhou, Wentao
    Liu, Zeliang
    Li, Huijian
    Tao, Chenglin
    Zhou, Xin
    SHOCK AND VIBRATION, 2025, 2025 (01)
  • [46] Quantifying Demonstration Quality for Robot Learning and Generalization
    Sakr, Maram
    Li, Zexi Jesse
    van der Loos, H. F. Machiel
    Kulic, Dana
    Croft, Elizabeth A.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (04): : 9659 - 9666
  • [47] The Effects of a Robot's Performance on Human Teachers for Learning from Demonstration Tasks
    Hedlund, Erin
    Johnson, Michael
    Gombolay, Matthew
    2021 16TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI, 2021, : 207 - 215
  • [48] Learning from Demonstration Facilitates Human-Robot Collaborative Task Execution
    Koskinopoulou, Maria
    Piperakis, Stylianos
    Trahanias, Panos
    ELEVENTH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN ROBOT INTERACTION (HRI'16), 2016, : 59 - 66
  • [49] ROBOT LEARNING FROM HUMAN DEMONSTRATION OF ACTIVITIES OF DAILY LIVING (ADL) TASKS
    Trivedi, Urvish
    Alqasemi, Redwan
    Dubey, Rajiv
    PROCEEDINGS OF ASME 2021 INTERNATIONAL MECHANICAL ENGINEERING CONGRESS AND EXPOSITION (IMECE2021), VOL 6, 2021,
  • [50] Robot Performing Peg-in-Hole Operations by Learning from Human Demonstration
    Zhu, Zuyuan
    Hu, Huosheng
    Gu, Dongbing
    2018 10TH COMPUTER SCIENCE AND ELECTRONIC ENGINEERING CONFERENCE (CEEC), 2018, : 30 - 35