Understanding and learning of gestures through human-robot interaction

Cited by: 0
Authors
Kuno, Y [1 ]
Murashima, T [1 ]
Shimada, N [1 ]
Shirai, Y [1 ]
Affiliation
[1] Osaka Univ, Dept Comp Controlled Mech Syst, Suita, Osaka 5650871, Japan
Keywords
DOI: none
CLC Classification: T [Industrial Technology]
Subject Classification: 08
Abstract
Humans can communicate with each other through gestures. Even when they cannot recognize a gesture at first, they can come to understand each other through interaction. This paper presents a robot system with such a capability. The robot detects its user by recognizing his/her face, and it accepts commands given by gestures. The user may use gestures unknown to the robot. If the robot does not respond to a gesture, the user usually repeats it. The robot detects this repetitive pattern as an intentional gesture by which the user wants to give it some order. It then performs a small action corresponding to the meaning it guesses for the gesture and observes the user's reaction. If the user continues the same gesture pattern, the robot concludes that its interpretation is correct and completes the action. It also registers the pattern as a gesture with the guessed meaning. Otherwise, it repeats the same procedure with another action as a candidate for the meaning. We have implemented this interactive capability on our intelligent wheelchair; it is convenient to be able to make the wheelchair come or go by gestures when we are off it. Experimental results confirm that the proposed interaction method is useful in real, complex environments where even registered gestures cannot always be recognized.
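The confirmation loop described in the abstract can be sketched as pseudocode-style Python. This is a minimal illustration only: the paper's actual system is a vision-based intelligent wheelchair, and every name here (`show_trial_action`, `learn_unknown_gesture`, the candidate meanings) is a hypothetical assumption, not the authors' implementation.

```python
# Sketch of the interactive gesture-learning loop from the abstract:
# for a repeated, unrecognized gesture pattern, try candidate meanings
# one by one until the user's persistence confirms a guess.

def show_trial_action(meaning):
    # The robot performs "a little action" matching its current guess.
    print(f"robot: trial action for '{meaning}'")

def complete_action(meaning):
    # The guess was confirmed, so the robot carries the action out fully.
    print(f"robot: completing action '{meaning}'")

def learn_unknown_gesture(pattern, candidate_meanings, user_repeats_gesture,
                          registered_gestures):
    """Try candidate meanings for a repeated, unrecognized gesture pattern.

    For each guess the robot shows a small trial action and observes the
    user. If the user keeps repeating the same gesture, the guess is taken
    as correct: the action is completed and the pattern is registered with
    that meaning. Otherwise the next candidate is tried.
    """
    for meaning in candidate_meanings:
        show_trial_action(meaning)
        if user_repeats_gesture(pattern):   # user persists -> guess confirmed
            complete_action(meaning)
            registered_gestures[pattern] = meaning  # learn the new gesture
            return meaning
    return None  # no candidate meaning was confirmed

# Hypothetical usage: the user stops after the "go" trial but keeps
# gesturing after the "come" trial, so "come" is learned for the pattern.
reactions = iter([False, True])
registered = {}
learned = learn_unknown_gesture("wave", ["go", "come"],
                                lambda pattern: next(reactions), registered)
# learned == "come"; registered == {"wave": "come"}
```

The key design point the abstract describes is that confirmation is implicit: the user never labels the gesture, and persistence of the same gesture pattern after a trial action is itself the confirmation signal.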
Pages: 2133 - 2138 (6 pages)
Related Papers (50 total)
  • [21] Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation
    Yu, Mingxin
    Lin, Yingzi
    Schmidt, David
    Wang, Xiangzhou
    Wang, Yu
    JOURNAL OF EYE MOVEMENT RESEARCH, 2014, 7 (04):
  • [22] Human-Robot Interaction Using Three-Dimensional Gestures
    Ponmani, K.
    Sridharan, S.
    INTELLIGENT EMBEDDED SYSTEMS, ICNETS2, VOL II, 2018, 492 : 67 - 76
  • [23] Analysis of Task-Based Gestures in Human-Robot Interaction
    Haddadi, Amir
    Croft, Elizabeth A.
    Gleeson, Brian T.
    MacLean, Karon
    Alcazar, Javier
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2013, : 2146 - 2152
  • [24] HRI-Gestures: Gesture Recognition for Human-Robot Interaction
    Kollakidou, Avgi
    Haarslev, Frederik
    Odabasi, Cagatay
    Bodenhagen, Leon
    Krueger, Norbert
    PROCEEDINGS OF THE 17TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS (VISAPP), VOL 5, 2022, : 559 - 566
  • [25] Robot-Generated Mixed Reality Gestures Improve Human-Robot Interaction
    Nhan Tran
    Grant, Trevor
    Thao Phung
    Hirshfield, Leanne
    Wickens, Christopher
    Williams, Tom
    SOCIAL ROBOTICS, ICSR 2021, 2021, 13086 : 768 - 773
  • [26] Prosody-Driven Robot Arm Gestures Generation in Human-Robot Interaction
    Aly, Amir
    Tapus, Adriana
    HRI'12: PROCEEDINGS OF THE SEVENTH ANNUAL ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2012, : 257 - 258
  • [27] Continual Learning Through Human-Robot Interaction: Human Perceptions of a Continual Learning Robot in Repeated Interactions
    Ayub, Ali
    De Francesco, Zachary
    Holthaus, Patrick
    Nehaniv, Chrystopher L.
    Dautenhahn, Kerstin
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2025, 17 (02) : 277 - 296
  • [28] Exploring the Effect of Robot Hand Configurations in Directional Gestures for Human-Robot Interaction
    Sheikholeslami, Sara
    Moon, AJung
    Croft, Elizabeth A.
    2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 3594 - 3599
  • [29] Constructive learning for human-robot interaction
    Singh, Amarjot
    Karanam, Srikrishna
    Kumar, Devinder
    IEEE Potentials, 2013, 32 (04): 13 - 19
  • [30] Online learning for human-robot interaction
    Raducanu, Bogdan
    Vitria, Jordi
    2007 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, VOLS 1-8, 2007, : 3342 - +