A Robot Learning from Demonstration Platform Based on Optical Motion Capture

Cited by: 0
Authors
Yan, Hengyuan [1 ]
Zhou, Haiping [2 ,3 ]
Hu, Haopeng [1 ]
Lou, Yunjiang [1 ]
Affiliations
[1] Harbin Inst Technol Shenzhen, Shenzhen 518000, Peoples R China
[2] Beijing Inst Precis Mechatron & Controls, Beijing, Peoples R China
[3] Lab Aerosp Servo Actuat & Transmiss, Beijing, Peoples R China
Source
INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT II | 2021, Vol. 13014
Keywords
Learning from demonstration; Motion capture; Robot learning
DOI
10.1007/978-3-030-89098-8_10
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Motion capture (MoCap) is a technology for capturing the movement of a target with sensors such as optical equipment or inertial measurement units, and it is widely used in industry. In this work, a robot learning from demonstration platform is established that comprises motion capture, data pre-processing, policy learning, and a robot controller. An optical MoCap system serves as the sensor that acquires the motion data of the demonstrated target. Since data obtained from a MoCap system typically suffer from noise and data loss, a two-stage data processing strategy, consisting of data pre-processing and policy learning, is proposed to obtain a smooth robot motion trajectory. The learned trajectory is then transmitted to the designed robot controller to drive a real robot. The proposed platform, designed for the rapid deployment of robots on industrial production lines, is convenient for secondary development and enables non-robotics professionals to operate robots easily.
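
To make the data pre-processing stage concrete, the sketch below shows one standard way to address the two problems the abstract names, marker dropout and sensor noise: gaps are filled by interpolation and the result is smoothed with a zero-phase low-pass filter. This is a minimal illustration under assumed parameters (capture rate, cutoff frequency, filter order), not the authors' implementation; the function preprocess_mocap and its defaults are hypothetical.

import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import butter, filtfilt

def preprocess_mocap(t, xyz, fs=120.0, cutoff=6.0):
    """Fill gaps (NaN rows) in a MoCap trajectory, then low-pass filter it.

    t      : (N,) timestamps in seconds
    xyz    : (N, 3) marker positions, with lost frames marked as NaN
    fs     : capture rate in Hz (assumed value)
    cutoff : low-pass cutoff in Hz (assumed value)
    """
    valid = ~np.isnan(xyz).any(axis=1)
    # Linearly interpolate each coordinate over occluded frames.
    filled = interp1d(t[valid], xyz[valid], axis=0, kind="linear",
                      fill_value="extrapolate")(t)
    # 4th-order zero-phase Butterworth filter: smooths sensor jitter
    # without introducing phase lag into the recovered trajectory.
    b, a = butter(4, cutoff / (0.5 * fs))
    return filtfilt(b, a, filled, axis=0)

A trajectory cleaned this way can then serve as input to the policy-learning stage and, ultimately, to the robot controller.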
Pages: 100-110
Page count: 11