Cooperative Human-Robot Interaction based on Pointing Gesture in Informationally Structured Space

Cited by: 0
Authors
Obo, Takenori [1 ]
Kawabata, Ryosuke [1 ]
Kubota, Naoyuki [2 ]
Affiliations
[1] Tokyo Polytech Univ, Dept Appl Comp Sci, Atsugi, Kanagawa, Japan
[2] Tokyo Metropolitan Univ, Dept Syst Design, Tokyo, Japan
Source
2018 WORLD AUTOMATION CONGRESS (WAC) | 2018
Keywords
Human-Robot Interaction; Communication Robot; Gesture Recognition; Feature Extraction; Object Recognition;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Recently, various types of communication robots have been developed and have become more familiar in everyday life. Human-like conversation that combines gestures and verbal cues contributes to more natural communication. In this study, we present an approach for cooperative interaction between a human and a robot. Pointing gestures are an important means of sharing one's cognitive environment with others. Functionally, pointing is a special kind of gesture: it directs another person's attention to something without conveying a specific meaning in the manner of most conventionalized, symbolic gestures. This paper proposes an online learning structure to model the correlation between pointing gestures and verbal cues in human-robot interaction.
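The abstract only outlines the approach, so the sketch below is a rough illustration of the kind of pipeline it implies, not the authors' implementation: a pointing ray estimated from two arm joints is matched against candidate objects in the structured space, and the pointed-at object is paired with a spoken word in a simple online association model. All function names, the choice of joints, and the learning rule are assumptions made for illustration.

```python
# Minimal sketch (illustrative assumptions only): estimate a pointing ray from
# elbow/wrist joints, pick the nearest candidate object by angular distance,
# and reinforce an online word-object association for the verbal cue.
import numpy as np
from collections import defaultdict

def pointing_ray(elbow, wrist):
    """Return origin and unit direction of a ray through elbow -> wrist."""
    elbow, wrist = np.asarray(elbow, float), np.asarray(wrist, float)
    direction = wrist - elbow
    return wrist, direction / np.linalg.norm(direction)

def pointed_object(origin, direction, objects):
    """Pick the candidate object with the smallest angle to the pointing ray."""
    best_name, best_angle = None, np.inf
    for name, position in objects.items():
        to_obj = np.asarray(position, float) - origin
        cos_angle = np.dot(direction, to_obj) / np.linalg.norm(to_obj)
        angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name, best_angle

class OnlineWordObjectModel:
    """Incrementally reinforce co-occurrence of spoken words and pointed objects."""
    def __init__(self, learning_rate=0.1):
        self.learning_rate = learning_rate
        self.weights = defaultdict(float)  # (word, object) -> association strength

    def update(self, word, obj):
        key = (word, obj)
        self.weights[key] += self.learning_rate * (1.0 - self.weights[key])

    def best_object(self, word, candidates):
        return max(candidates, key=lambda obj: self.weights[(word, obj)])

if __name__ == "__main__":
    # Hypothetical 3D object positions (metres) provided by the structured space.
    objects = {"cup": (0.6, 0.1, 0.8), "book": (-0.3, 0.0, 0.9)}
    origin, direction = pointing_ray(elbow=(0.2, 0.3, 0.2), wrist=(0.3, 0.25, 0.4))
    target, angle = pointed_object(origin, direction, objects)

    model = OnlineWordObjectModel()
    model.update("cup", target)  # pair the verbal cue with the pointed-at target
    print(target, round(np.degrees(angle), 1), model.best_object("cup", objects.keys()))
```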
Pages: 103-108
Number of pages: 6
Related Papers
50 records in total
  • [41] Computer vision-based hand gesture recognition for human-robot interaction: a review
    Jing Qi
    Li Ma
    Zhenchao Cui
    Yushu Yu
    Complex & Intelligent Systems, 2024, 10 : 1581 - 1606
  • [42] Intuitiveness Level: Frustration-Based Methodology for Human-Robot Interaction Gesture Elicitation
    Canuto, Clebeson
    Freire, Eduardo O.
    Molina, Lucas
    Carvalho, Elyson A. N.
    Givigi, Sidney N.
    IEEE ACCESS, 2022, 10 : 17145 - 17154
  • [43] Recognizing pointing behavior using image processing for human-robot interaction
    Sakurai, Shoichiro
    Sato, Eri
    Yamaguchi, Toru
    2007 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS, VOLS 1-3, 2007, : 762 - +
  • [44] Pointing Gestures for Human-Robot Interaction in Service Robotics: A Feasibility Study
    Pozzi, Luca
    Gandolla, Marta
    Roveda, Loris
    COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, ICCHP-AAATE 2022, PT II, 2022, : 461 - 468
  • [45] Robot-Facilitated Human-Robot Interaction with Integrated Tracking, Re-identification and Gesture Recognition
    Lee, Sukhan
    Lee, Soojin
    Kim, Seunghwan
    Kim, Aruem
    INTELLIGENT AUTONOMOUS SYSTEMS 18, VOL 1, IAS18-2023, 2024, 795 : 257 - 275
  • [46] Integration of Tracking, Re-Identification, and Gesture Recognition for Facilitating Human-Robot Interaction
    Lee, Sukhan
    Lee, Soojin
    Park, Hyunwoo
    SENSORS, 2024, 24 (15)
  • [47] Daily Gesture Recognition During Human-Robot Interaction Combining Vision and Wearable Systems
    Fiorini, Laura
    Loizzo, Federica G. Cornacchia
    Sorrentino, Alessandra
    Kim, Jaeseok
    Rovini, Erika
    Di Nuovo, Alessandro
    Cavallo, Filippo
    IEEE SENSORS JOURNAL, 2021, 21 (20) : 23568 - 23577
  • [48] Foundations of Visual Linear Human–Robot Interaction via Pointing Gesture Navigation
    Michal Tölgyessy
    Martin Dekan
    František Duchoň
    Jozef Rodina
    Peter Hubinský
    L’uboš Chovanec
    International Journal of Social Robotics, 2017, 9 : 509 - 523
  • [49] RaCon: A gesture recognition approach via Doppler radar for intelligent human-robot interaction
    Zhang, Kaijie
    Yu, Zhiwen
    Zhang, Dong
    Wang, Zhu
    Guo, Bin
    2020 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS WORKSHOPS (PERCOM WORKSHOPS), 2020,
  • [50] Human-Robot Interaction Through Gesture-Free Spoken Dialogue
    Vladimir Kulyukin
    Autonomous Robots, 2004, 16 : 239 - 257