Cooperative Human-Robot Interaction based on Pointing Gesture in Informationally Structured Space

Cited: 0
Authors
Obo, Takenori [1 ]
Kawabata, Ryosuke [1 ]
Kubota, Naoyuki [2 ]
Affiliations
[1] Tokyo Polytech Univ, Dept Appl Comp Sci, Atsugi, Kanagawa, Japan
[2] Tokyo Metropolitan Univ, Dept Syst Design, Tokyo, Japan
Source
2018 WORLD AUTOMATION CONGRESS (WAC) | 2018
Keywords
Human-Robot Interaction; Communication Robot; Gesture Recognition; Feature Extraction; Object Recognition;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
Recently, various types of communication robots have been developed and have become increasingly familiar in daily life. Human-like conversation that combines gestures and verbal cues contributes to more natural communication. In this study, we present an approach for cooperative interaction between a human and a robot. Pointing gestures are an important means of sharing one's cognitive environment with others. Functionally, pointing is a special gesture: it directs another person's attention to something without conveying a specific meaning in the manner of most conventionalized, symbolic gestures. This paper proposes an online learning structure that models the correlation between pointing gestures and verbal cues in human-robot interaction.
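As a concrete illustration of how a pointing gesture can be grounded against objects registered in an informationally structured space, the following is a minimal sketch, not the authors' implementation. It assumes elbow and wrist positions are available from a skeleton tracker and that object positions are provided by the environment's sensors; the function names, coordinates, and distance threshold are hypothetical.

```python
# Minimal sketch (illustrative only): estimate a pointing ray from two tracked
# 3D joints and select the registered object nearest to that ray.
import numpy as np

def pointing_ray(elbow, wrist):
    """Return (origin, unit direction) of the ray running from elbow through wrist."""
    origin = np.asarray(wrist, dtype=float)
    direction = origin - np.asarray(elbow, dtype=float)
    return origin, direction / np.linalg.norm(direction)

def point_to_ray_distance(point, origin, direction):
    """Perpendicular distance from a 3D point to the ray (points behind the origin are penalized)."""
    v = np.asarray(point, dtype=float) - origin
    t = max(np.dot(v, direction), 0.0)  # project onto the ray, clamp to its front side
    return np.linalg.norm(v - t * direction)

def resolve_target(objects, origin, direction, max_dist=0.3):
    """Return the name of the registered object closest to the pointing ray, or None."""
    name, pos = min(objects.items(),
                    key=lambda kv: point_to_ray_distance(kv[1], origin, direction))
    return name if point_to_ray_distance(pos, origin, direction) <= max_dist else None

if __name__ == "__main__":
    # Object positions assumed to come from the environmental sensors of the
    # informationally structured space (coordinates are illustrative).
    objects = {"cup": (1.0, 0.2, 0.8), "book": (1.5, -0.6, 0.7)}
    origin, direction = pointing_ray(elbow=(0.0, 0.0, 1.0), wrist=(0.2, 0.05, 0.95))
    print(resolve_target(objects, origin, direction))  # -> "cup"
```

In such a setup, the resolved object label could then be paired with the accompanying verbal cue as one training sample for the proposed online learning of the gesture-speech correlation.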
Pages: 103 - 108
Page count: 6