A Vision-based Correction of Inertial Measurement of Human Motion for Robot Programming by Demonstration

Cited by: 1
Authors
Pellois, Robin [1 ]
Bruels, Olivier [1 ]
Affiliations
[1] Univ Liege, Fac Appl Sci, Dept Aerosp & Mech Engn, Liege, Belgium
Source
INTERNATIONAL JOURNAL OF MECHANICAL ENGINEERING AND ROBOTICS RESEARCH | 2022, Vol. 11, No. 06
Keywords
programming by demonstration; vision; inertial human motion tracking; merging;
DOI
10.18178/ijmerr.11.6.411-416
CLC classification number
TH [Machinery and Instrument Industry];
Discipline classification code
0802;
Abstract
In this work we propose an original approach that corrects the inertial measurement of the human hand trajectory with vision-based object tracking, in the context of programming by demonstration (PbD) of pick-and-place tasks. One challenge in PbD is to record human demonstrations accurately enough while keeping the recording process simple and unobtrusive to human motion. Merging inertial and vision-based technologies can take advantage of both and fulfill the requirements of a PbD process. Our method is based on the identification of Positions of Interest (POIs), corresponding to picking or placing actions, from the object and hand data. Object POIs are then paired with hand POIs in order to correct the human hand trajectory. The method is implemented on a Sawyer robot with Xsens IMU sensors. Pick-and-place tasks of varying complexity have been recorded and reproduced by the robot. The robot successfully reproduces the demonstrated tasks, which validates our method.
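The abstract describes pairing vision-based object POIs with inertial hand POIs and using the pairs to correct the demonstrated trajectory. The following Python sketch illustrates one way such a correction could work; it is not the authors' implementation, and the function names, the dwell-speed heuristic for detecting hand POIs, and the time-interpolated offset correction are all assumptions made for illustration only.

# Hypothetical sketch (not the paper's code): detect hand POIs from the inertial
# trajectory, pair them in order with vision-measured object POIs, and warp the
# trajectory so it passes through the vision-based pick/place positions.
import numpy as np

def detect_hand_pois(t, hand_xyz, speed_thresh=0.02, min_dwell=0.3):
    """Return indices where the hand dwells (candidate pick/place instants)."""
    vel = np.gradient(hand_xyz, t, axis=0)          # numerical velocity
    speed = np.linalg.norm(vel, axis=1)
    slow = speed < speed_thresh                     # assumed dwell criterion
    pois, start = [], None
    for i, s in enumerate(slow):
        if s and start is None:
            start = i
        elif not s and start is not None:
            if t[i - 1] - t[start] >= min_dwell:
                pois.append((start + i - 1) // 2)   # mid-point of the dwell
            start = None
    if start is not None and t[-1] - t[start] >= min_dwell:
        pois.append((start + len(t) - 1) // 2)
    return pois

def correct_trajectory(t, hand_xyz, hand_poi_idx, object_poi_xyz):
    """Shift the inertial trajectory so each hand POI coincides with the
    vision-based object POI it is paired with (pairing assumed in order)."""
    corrected = hand_xyz.copy()
    offsets = [obj - hand_xyz[i] for i, obj in zip(hand_poi_idx, object_poi_xyz)]
    poi_times = t[hand_poi_idx]
    # Interpolate the correction offset linearly in time between consecutive POIs.
    for d in range(3):
        corrected[:, d] += np.interp(t, poi_times, [o[d] for o in offsets])
    return corrected

In this sketch the inertial trajectory is only shifted locally around each pick or place instant, which mirrors the idea of using the more accurate vision-based object positions to anchor the drift-prone inertial data at the task-relevant points.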
Pages: 411 - 416
Number of pages: 6
Related Papers
50 records in total
  • [1] An inertial human upper limb motion tracking method for robot programming by demonstration
    Pellois, Robin
    Bruls, Olivier
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2022, 156
  • [2] Industrial robot programming by demonstration using stereoscopic vision and inertial sensing
    de Souza, Joao Pedro C.
    Amorim, Antonio M.
    Rocha, Luis F.
    Pinto, Vitor H.
    Moreira, Antonio Paulo
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2022, 49 (01): 96 - 107
  • [3] Stereoscopic Vision System for Human Gesture Tracking and Robot Programming by Demonstration
    Ferreira, Marcos
    Rocha, Luis
    Costa, Paulo
    Moreira, A. Paulo
    ROBOTICS IN SMART MANUFACTURING, 2013, 371 : 82 - 90
  • [4] Human motion caption with vision and inertial sensors for hand/arm robot teleoperation
    Kobayashi, Futoshi
    Kitabayashi, Keiichi
    Shimizu, Kai
    Nakamoto, Hiroyuki
    Kojima, Fumio
    INTERNATIONAL JOURNAL OF APPLIED ELECTROMAGNETICS AND MECHANICS, 2016, 52 (3-4) : 1629 - 1636
  • [5] Localization Using Vision-Based Robot
    Yun, Yeol-Min
    Yu, Ho-Yun
    Lee, Jang-Myung
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2014, PT II, 2014, 8918 : 285 - 289
  • [6] Vision-based Robot Manipulator for Industrial Applications
    Ali, Md. Hazrat
    Aizat, K.
    Yerkhan, K.
    Zhandos, T.
    Anuar, O.
    INTERNATIONAL CONFERENCE ON ROBOTICS AND SMART MANUFACTURING (ROSMA2018), 2018, 133 : 205 - 212
  • [7] Active learning for vision-based robot grasping
    Salganicoff, M
    Ungar, LH
    Bajcsy, R
    MACHINE LEARNING, 1996, 23 (2-3) : 251 - 278
  • [8] Vision-based calibration of a Hexa parallel robot
    Dehghani, Mehdi
    Ahmadi, Mahdi
    Khayatian, Alireza
    Eghtesad, Mohamad
    Yazdi, Mehran
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2014, 41 (03): 296 - 310
  • [9] Vision-Based Biomechanical Markerless Motion Classification
    Liew Y.L.
    Chin J.F.
    MACHINE GRAPHICS AND VISION, 2023, 32 (01): 3 - 24
  • [10] Vision-based Semantic Unscented FastSLAM for Mobile Robot
    Liu, Letian
    Zhu, Xiaorui
    Ou, Yongsheng
    2014 11TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION (WCICA), 2014, : 1402 - 1408