Learning quasi-periodic robot motions from demonstration

Cited by: 2
Authors
Li, Xiao [1 ]
Cheng, Hongtai [1 ]
Chen, Heping [2 ]
Chen, Jiaming [3 ]
Affiliations
[1] Northeastern Univ, Sch Mech Engn & Automat, Shenyang, Liaoning, Peoples R China
[2] Texas State Univ San Marcos, Ingram Sch Engn, San Marcos, TX 78666 USA
[3] Northeastern Univ, Sch Sino Dutch Biomed & Informat Engn, Shenyang, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Quasi-periodic motion; Learning from demonstration; GMM; GMR; Empirical Mode Decomposition; Free-form surfaces; Framework
DOI
10.1007/s10514-019-09891-y
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The goal of Learning from Demonstration is to automatically transfer skill knowledge from human to robot. Current research focuses on modeling aperiodic/periodic robot motions and extracting dynamic task parameters from recorded sensory information. However, this is still not adequate for describing complex behaviors in an unstructured environment, such as searching for an unknown fitting position or painting/polishing an irregular surface. The quasi-periodic and stochastic properties of such tasks place high demands on the generalization ability of the modeling techniques. This paper proposes a systematic framework for learning quasi-periodic robot motions that consists of three steps: decomposition, modeling, and synthesis. First, an FFT is performed to identify all the frequencies in the quasi-periodic motion. The motion is then decomposed into an offset component and a series of harmonic and corresponding envelope components, based on the concept of equivalent transformation. The offset component is extracted by Empirical Mode Decomposition, the harmonics are separated by notch filters, and the envelope components are extracted by the Hilbert Transform. These components are either periodic or aperiodic. The aperiodic components can be modeled by conventional techniques such as the Gaussian Mixture Model and recovered by Gaussian Mixture Regression, while the periodic components are modeled in closed-form expressions. Finally, the components are synthesized to regenerate the robot motion. This modeling process captures both the aperiodicity and the periodicity of a quasi-periodic motion. Simulations and experiments show that the proposed methods are feasible, effective, and able to predict robot motions beyond the demonstrations. With this generalization ability, the framework reduces programming difficulty and demonstration complexity.
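As a rough illustration of the decomposition step described in the abstract, the following Python sketch separates a single dominant harmonic and its amplitude envelope from a 1-D signal using generic numpy/scipy tools. It is a minimal sketch, not the authors' implementation: the function name decompose_quasi_periodic, the filter parameters, and the synthetic test signal are assumptions, and a plain notch-filter residue stands in for the Empirical Mode Decomposition the paper uses to extract the offset component.

import numpy as np
from scipy.signal import hilbert, iirnotch, filtfilt, detrend

def decompose_quasi_periodic(x, fs):
    """Split a 1-D motion signal into offset, harmonic, and envelope parts."""
    # 1. Identify the dominant frequency with an FFT (assumes one main spectral peak).
    spectrum = np.abs(np.fft.rfft(detrend(x)))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    k = np.argmax(spectrum[1:]) + 1          # skip the DC bin
    f0 = freqs[k]

    # 2. Suppress the dominant harmonic with a notch filter; the residue
    #    approximates the aperiodic offset component (the paper extracts it with EMD).
    b, a = iirnotch(w0=f0, Q=2.0, fs=fs)
    offset = filtfilt(b, a, x)

    # 3. The harmonic-plus-envelope part is what the notch removed.
    harmonic = x - offset

    # 4. Its amplitude envelope follows from the Hilbert transform.
    envelope = np.abs(hilbert(harmonic))
    return f0, offset, harmonic, envelope

# Usage on a synthetic quasi-periodic signal: a drifting offset plus an
# amplitude-modulated 2 Hz oscillation.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
x = 0.5 * t + (1.0 + 0.3 * np.sin(0.5 * t)) * np.sin(2.0 * np.pi * 2.0 * t)
f0, offset, harmonic, envelope = decompose_quasi_periodic(x, fs)
print(f"dominant frequency ~ {f0:.2f} Hz")

The extracted offset and envelope are the aperiodic signals that, per the abstract, would then be modeled with GMM/GMR, while the harmonic at f0 admits a closed-form expression.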
Pages: 251-266
Number of pages: 16