Modular Control for Human Motion Analysis and Classification in Human-Robot Interaction

Cited by: 0
Authors
Rivera-Bautista, Juan Alberto [1]
Ramirez-Hernandez, Ana Cristina [1]
Garcia-Vega, Virginia A. [1 ]
Marin-Hernandez, Antonio [1 ]
Affiliations
[1] Univ Veracruzana, Dept Inteligencia Artificial, Xalapa, Veracruz, Mexico
Source
PROCEEDINGS OF THE 5TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI 2010) | 2010
Keywords
human walking gestures; attitude interpretation; Human-Robot interaction; sensor fusion;
DOI
10.1145/1734454.1734527
CLC number
TP3 [Computing technology and computer technology]
Subject classification code
0812
Abstract
Trajectories followed by humans can be interpreted as attitude gestures. Based on this interpretation, an autonomous mobile robot can decide how to initiate interaction with a given human. This work presents a modular control system that analyzes human walking trajectories in order to engage a robot in human-robot interaction. When the robot detects a human with its vision system, a visual tracking module begins to operate the Pan/Tilt/Zoom (PTZ) camera unit. The camera parameter configuration and the global robot localization are then used by another module to filter and track the human's legs in the laser range finder (LRF) data. The path followed by the human in the global reference frame is then processed by a further module, which determines the kind of attitude shown by the human. Based on the result, the robot decides whether an interaction is needed and who is expected to begin it. At present, only three kinds of attitudes are used: confidence, curiosity and nervousness.
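The abstract outlines a pipeline (vision-based detection, PTZ tracking, LRF leg tracking in the global frame, trajectory-based attitude classification) but gives no implementation details. As a rough illustration of the final stage only, the following Python sketch classifies a walking path into the three attitudes using assumed heuristic features (approach speed and heading variability). The function name, features, and thresholds are hypothetical and are not taken from the paper.

```python
# Illustrative sketch of a trajectory-based attitude classifier.
# Features and thresholds are assumptions, not the authors' method.
import math
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in the global reference frame, metres


@dataclass
class AttitudeEstimate:
    label: str                  # "confidence", "curiosity", or "nervousness"
    approach_speed: float       # m/s toward the robot (negative = moving away)
    heading_variability: float  # mean absolute heading change, radians


def classify_trajectory(path: List[Point], robot_pos: Point,
                        dt: float = 0.1) -> AttitudeEstimate:
    """Classify a human walking path relative to the robot.

    Assumed heuristics:
      * confidence  - steady, direct approach toward the robot
      * curiosity   - approach with noticeable heading changes
      * nervousness - erratic heading or movement away from the robot
    """
    if len(path) < 3:
        raise ValueError("need at least 3 samples to estimate speed and heading")

    # Rate at which the human closes the distance to the robot.
    d_start = math.dist(path[0], robot_pos)
    d_end = math.dist(path[-1], robot_pos)
    approach_speed = (d_start - d_end) / (dt * (len(path) - 1))

    # Mean absolute change of walking direction between consecutive steps.
    headings = [math.atan2(y2 - y1, x2 - x1)
                for (x1, y1), (x2, y2) in zip(path, path[1:])]
    turns = [abs(math.atan2(math.sin(b - a), math.cos(b - a)))
             for a, b in zip(headings, headings[1:])]
    heading_variability = sum(turns) / len(turns)

    # Illustrative thresholds; a real system would tune or learn these.
    if approach_speed > 0.3 and heading_variability < 0.2:
        label = "confidence"
    elif approach_speed > 0.0:
        label = "curiosity"
    else:
        label = "nervousness"
    return AttitudeEstimate(label, approach_speed, heading_variability)


if __name__ == "__main__":
    robot = (0.0, 0.0)
    # A direct, steady approach toward the robot at roughly 1 m/s.
    direct_path = [(5.0 - 0.1 * i, 0.0) for i in range(40)]
    print(classify_trajectory(direct_path, robot))  # expected label: confidence
```

In a modular system like the one described, such a classifier would sit downstream of the leg-tracking module and feed its label to the decision module that chooses whether the robot or the human should initiate the interaction.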
Pages: 169-170
Page count: 2
Related papers
50 entries in total
  • [1] Motion Classification using IMU for Human-Robot Interaction
    Saktaweekulkit, Kawroong
    Maneewarn, Thavida
    INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2010), 2010, : 2295 - 2299
  • [2] Interaction Task Motion Learning for Human-Robot Interaction Control
    Lyu, Shangke
    Selvaraj, Nithish Muthuchamy
    Cheah, Chien Chern
    IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 2022, 52 (05) : 894 - 906
  • [3] Reactive motion control for human-robot tactile interaction
    Wösch, T
    Feiten, W
    2002 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS I-IV, PROCEEDINGS, 2002, : 3807 - 3812
  • [4] Motion control and experimentation on physical human-robot interaction
    Xie, Guanghui
    Jin, Mina
    Wang, Guangjian
    Hashimoto, Minoru
    Yang, Zhiping
    Yingyong Jichu yu Gongcheng Kexue Xuebao/Journal of Basic Science and Engineering, 2014, 22 (05): : 1018 - 1029
  • [5] Neural Control for Human-Robot Interaction with Human Motion Intention Estimation
    Peng, Guangzhu
    Yang, Chenguang
    Chen, C. L. Philip
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2024, 71 (12) : 16317 - 16326
  • [6] Human-robot interaction and robot control
    Sequeira, Joao
    Ribeiro, Maria Isabel
    ROBOT MOTION AND CONTROL: RECENT DEVELOPMENTS, 2006, 335 : 375 - 390
  • [7] Real-time human motion analysis for human-robot interaction
    Molina-Tanco, L
    Bandera, JP
    Marfil, R
    Sandoval, F
    2005 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4, 2005, : 1808 - 1813
  • [8] Timing Control of Utterance and Body Motion in Human-Robot Interaction
    Namera, Kensaku
    Takasugi, Shoji
    Takano, Koji
    Yamamoto, Tomohito
    Miyake, Yoshihiro
    2008 17TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, VOLS 1 AND 2, 2008, : 119 - +
  • [9] Motion Retargeting and Control for Teleoperated Physical Human-Robot Interaction
    Kaplish, Akshit
    Yamane, Katsu
    2019 IEEE-RAS 19TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2019, : 723 - 730
  • [10] Evolutionary Motion Control Optimization in Physical Human-Robot Interaction
    Nadeau, Nicholas A.
    Bonev, Ilian A.
    2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2018, : 1347 - 1353