An Integrative Framework of Human Hand Gesture Segmentation for Human-Robot Interaction

Cited by: 40
Authors
Ju, Zhaojie [1 ]
Ji, Xiaofei [2 ]
Li, Jing [3 ,4 ]
Liu, Honghai [1 ]
Affiliations
[1] Univ Portsmouth, Sch Comp, Portsmouth PO1 2UP, Hants, England
[2] Shenyang Aerosp Univ, Sch Automat, Shenyang 110136, Liaoning, Peoples R China
[3] Nanchang Univ, Sch Informat Engn, Nanchang 330047, Jiangxi, Peoples R China
[4] Nanchang Univ, Jiangxi Prov Key Lab Intelligent Informat Syst, Nanchang 330047, Jiangxi, Peoples R China
Source
IEEE SYSTEMS JOURNAL | 2017 / Vol. 11 / Issue 03
Funding
National Natural Science Foundation of China; Engineering and Physical Sciences Research Council (UK);
Keywords
Alignment; hand gesture segmentation; human-computer interaction (HCI); RGB-depth (RGB-D); CAMERA CALIBRATION; KINECT SENSOR; RECOGNITION; DEPTH;
DOI
10.1109/JSYST.2015.2468231
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This paper proposes a novel framework for segmenting hand gestures in RGB-depth (RGB-D) images captured by a Kinect sensor, using human-like approaches for human-robot interaction. The goal is to reduce Kinect sensing error and thereby improve the precision of hand gesture segmentation for the NAO robot. The proposed framework consists of two main novel approaches. First, the depth map and RGB image are aligned by using a genetic algorithm to estimate key points; the alignment is robust to uncertainty in the number of extracted points. Second, the edges of tracked hand gestures in the RGB images are refined by a modified expectation-maximization (EM) algorithm based on Bayesian networks. Experimental results demonstrate that the proposed alignment method precisely matches the depth maps to the RGB images, and that the EM algorithm further refines the RGB edges of the segmented hand gestures. The framework has been integrated and validated in a human-robot interaction system, improving the NAO robot's understanding and interpretation performance.
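The abstract names the two core techniques without giving implementation detail, so the following is a minimal, illustrative Python sketch rather than the authors' method. It assumes a simple scale-plus-translation model for the depth-to-RGB alignment and textbook genetic-algorithm operators (truncation selection, averaging crossover, Gaussian mutation); the function names (`ga_align`, `alignment_error`) and all parameter values are hypothetical.

```python
# Illustrative sketch only -- NOT the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

def alignment_error(params, depth_pts, rgb_pts):
    # params = (sx, sy, tx, ty): assumed scale-plus-translation mapping
    # from depth-map key points to RGB key points.
    sx, sy, tx, ty = params
    mapped = depth_pts * np.array([sx, sy]) + np.array([tx, ty])
    return np.mean(np.linalg.norm(mapped - rgb_pts, axis=1))

def ga_align(depth_pts, rgb_pts, pop_size=60, generations=200):
    # Toy genetic algorithm: evolve transform parameters that map the
    # depth key points onto the RGB key points (lower error = fitter).
    pop = rng.normal([1.0, 1.0, 0.0, 0.0], [0.2, 0.2, 20.0, 20.0],
                     size=(pop_size, 4))
    for _ in range(generations):
        err = np.array([alignment_error(p, depth_pts, rgb_pts) for p in pop])
        parents = pop[np.argsort(err)[: pop_size // 2]]    # truncation selection
        pairs = rng.integers(0, len(parents),
                             size=(pop_size - len(parents), 2))
        children = parents[pairs].mean(axis=1)             # averaging crossover
        children += rng.normal(0.0, [0.02, 0.02, 2.0, 2.0],
                               size=children.shape)        # Gaussian mutation
        pop = np.vstack([parents, children])
    err = np.array([alignment_error(p, depth_pts, rgb_pts) for p in pop])
    return pop[np.argmin(err)]

# Synthetic sanity check: recover a known transform from noisy key points.
depth_pts = rng.uniform(0, 480, size=(12, 2))
rgb_pts = depth_pts * [1.1, 0.95] + [15, -8] + rng.normal(0, 0.5, (12, 2))
print(ga_align(depth_pts, rgb_pts))   # approx. [1.1, 0.95, 15, -8]
```

The paper's Bayesian-network-based EM edge refinement is likewise not specified in the abstract; a standard two-component Gaussian-mixture EM over pixel intensities sampled near the coarse segmentation edge serves here as a hedged stand-in for the soft hand/background relabeling it performs.

```python
def em_refine(intensities, iters=50):
    # Textbook EM for a two-component 1-D Gaussian mixture, standing in
    # for the paper's Bayesian-network-based EM: pixels near the coarse
    # edge are softly relabeled as hand vs. background.
    x = np.asarray(intensities, dtype=float)
    mu = np.array([x.min(), x.max()])        # component means
    var = np.full(2, x.var() + 1e-6)         # component variances
    pi = np.array([0.5, 0.5])                # mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel.
        dens = pi / np.sqrt(2 * np.pi * var) * \
            np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return resp  # soft hand/background assignment per edge pixel
```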
Pages: 1326-1336
Number of pages: 11