Stereoscopic Vision System for Human Gesture Tracking and Robot Programming by Demonstration

Cited by: 0
Authors
Ferreira, Marcos
Rocha, Luis
Costa, Paulo
Moreira, A. Paulo
Institutions
Source
ROBOTICS IN SMART MANUFACTURING | 2013 / Vol. 371
Keywords
Programming-by-demonstration; motion tracking; artificial vision; skill-transfer; robotics; industrial manipulators;
DOI
Not available
CLC number (Chinese Library Classification)
T [Industrial Technology]
Discipline classification code
08
Abstract
This paper presents a framework for robot programming by demonstration using gestures. It is based on a luminous multi-LED marker captured by a pair of industrial cameras. Using stereoscopy, the marker provides complete 6-DoF human gesture tracking, yielding both position and orientation. Tests show that the developed setup is industrial grade: it is precise enough for many industrial applications and robust, particularly to varying lighting conditions. Attaching the marker to the operator's work tool provides an efficient way to track human movements without further intrusion into the process. The resulting path is used to generate a program for an industrial manipulator, closing the cycle in a human-robot skill-transfer framework.
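The pipeline described in the abstract hinges on two steps: triangulating the LED marker from the two calibrated camera views, and recovering the marker's full 6-DoF pose (position and orientation). A minimal sketch of this idea follows, assuming calibrated cameras with known projection matrices and already-matched LED image centroids; it uses OpenCV's cv2.triangulatePoints and a standard Kabsch/SVD rigid alignment, and the function names and marker model are illustrative rather than taken from the paper.

    # Minimal sketch (not the authors' implementation): triangulate LED blobs seen
    # by two calibrated cameras, then recover the marker's 6-DoF pose by rigidly
    # aligning the known marker geometry to the triangulated points.
    import numpy as np
    import cv2

    def triangulate_leds(P1, P2, pts_cam1, pts_cam2):
        # P1, P2: 3x4 projection matrices of the calibrated camera pair.
        # pts_cam1, pts_cam2: (N, 2) pixel coordinates of the same LEDs in each view.
        homog = cv2.triangulatePoints(P1, P2,
                                      pts_cam1.T.astype(np.float64),
                                      pts_cam2.T.astype(np.float64))
        return (homog[:3] / homog[3]).T          # (N, 3) LED positions in the world frame

    def marker_pose(model_pts, world_pts):
        # Rigid transform (R, t) mapping the marker's known geometry onto the
        # triangulated LED positions (Kabsch / SVD alignment).
        cm, cw = model_pts.mean(axis=0), world_pts.mean(axis=0)
        H = (model_pts - cm).T @ (world_pts - cw)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = cw - R @ cm
        return R, t                              # orientation + position = 6 DoF

In a setup like the one described, the sequence of (R, t) poses sampled over time would form the demonstrated path that is then converted into a program for the industrial manipulator.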
Pages: 82 - 90
Number of pages: 9
Related Papers
50 records in total
  • [41] Telepresence Robot with Image-based Face Tracking and 3D Perception with Human Gesture Interface using Kinect Sensor
    Berri, Rafael
    Wolf, Denis
    Osorio, Fernando
2014 2ND BRAZILIAN ROBOTICS SYMPOSIUM (SBR) / 11TH LATIN AMERICAN ROBOTICS SYMPOSIUM (LARS) / 6TH ROBOCONTROL WORKSHOP ON APPLIED ROBOTICS AND AUTOMATION, 2014: 205-210
  • [42] Skill based robot programming: Assembly, vision and Workspace Monitoring skill interaction
    Herrero, Hector
    Abou Moughlbay, Amine
    Luis Outon, Jose
    Salle, Damien
    Lopez de Ipina, Karmele
NEUROCOMPUTING, 2017, 255: 61-70
  • [43] Design and implementation of an omnidirectional vision system for robot perception
    Shi, Qing
    Li, Chang
    Wang, Chunbao
    Luo, Haibo
    Huang, Qiang
    Fukuda, Toshio
MECHATRONICS, 2017, 41: 58-66
  • [44] Computer vision methods for robot tasks: Motion detection, depth estimation and tracking
    Martinez-Martin, E.
AI COMMUNICATIONS, 2012, 25(04): 373-375
  • [45] Coherence in One-Shot Gesture Recognition for Human-Robot Interaction
    Cabrera, Maria E.
    Voyles, Richard M.
    Wachs, Juan P.
COMPANION OF THE 2018 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'18), 2018: 75-76
  • [46] Detection system for humanoid robots in semi-structured environments based on stereoscopic vision
    Herrera, Oscar
    Gonzalez, Yesenia
    Cortez, Paola
    Granados, Benito
MEMORIA INVESTIGACIONES EN INGENIERIA, 2021, (21): 94-107
  • [47] Robust One-Shot Robot Programming by Demonstration Using Entity-Based Resources
    Orendt, Eric M.
    Riedl, Michael
    Henrich, Dominik
ADVANCES IN SERVICE AND INDUSTRIAL ROBOTICS, 2018, 49: 573-582
  • [48] Trajectory Learning for Robot Programming by Demonstration Using Hidden Markov Model and Dynamic Time Warping
    Vakanski, Aleksandar
    Mantegh, Iraj
    Irish, Andrew
    Janabi-Sharifi, Farrokh
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2012, 42(04): 1039-1052
  • [49] Initial Investigation of Gravity and Friction Compensation of 2-DOF Robot Manipulator for Programming by Demonstration
    Kim, Min-Gyu
    Park, In-Gyu
2015 12TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2015: 433-434
  • [50] Tracking a System of Multiple Cameras on a Rotating Spherical Robot
    Hasbany, James
    DeJong, Brian P.
    Karadogan, Ernur
    Yelamarthi, Kumar
    Smith, Jonathan M.
2017 IEEE SENSORS APPLICATIONS SYMPOSIUM (SAS), 2017