Classification of Hand Postures Based on 3D Vision Model for Human-Robot Interaction

Cited by: 0
Authors
Takimoto, Hironori [1 ]
Yoshimori, Seiki [2 ]
Mitsukura, Yasue [3 ]
Fukumi, Minoru [4 ]
Affiliations
[1] Okayama Prefectural Univ, Fac Comp Sci & Syst Engn Commun & Engn, 111 Kuboki, Okayama 7191197, Japan
[2] Nippon Bunri Univ, Fac Engn, Oita 8700397, Japan
[3] Tokyo Univ Agr & Technol, Grad Sch BioAppl & Syst Engn, Tokyo 1848588, Japan
[4] Univ Tokushima, Dept Informat & Sci Intelligent Syst, Tokushima 7708506, Japan
Source
2010 IEEE RO-MAN | 2010
Keywords
RECOGNITION;
DOI
None
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, a method for hand posture recognition that is robust to hand posture variation in real environments is proposed. Conventionally, data glove devices and 3D scanners have been used to extract hand-shape features. However, the performance of each approach degrades as hand posture varies. This paper therefore proposes posture fluctuation models for efficient hand posture recognition, based on the 3D hand shape and color features obtained from a stereo camera. A large dictionary for posture recognition is built from learned hand images that are auto-generated from a single scanned hand image according to the proposed models. To show the effectiveness of the proposed method, recognition performance and processing time are compared with those of a conventional method. In addition, an evaluation experiment using Japanese Sign Language is performed.
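The core idea of the abstract, auto-generating many training samples from one scanned hand shape by sampling a posture fluctuation model, can be sketched as follows. This is a minimal illustration under assumed details (the paper's actual models and features are not given here): the scanned hand is a 3D point cloud, fluctuation is modeled as small Gaussian-distributed rotations, and `build_dictionary` is a hypothetical helper name.

```python
import numpy as np

def rotation_matrix(ax, ay, az):
    """Compose rotations about the x, y, z axes (angles in radians)."""
    cx, sx = np.cos(ax), np.sin(ax)
    cy, sy = np.cos(ay), np.sin(ay)
    cz, sz = np.cos(az), np.sin(az)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def build_dictionary(points, n_samples=200, sigma_deg=10.0, seed=0):
    """Auto-create posture variants of ONE scanned hand point cloud
    by drawing small rotation angles from a Gaussian fluctuation model
    (an assumed stand-in for the paper's posture fluctuation models)."""
    rng = np.random.default_rng(seed)
    angles = np.deg2rad(rng.normal(0.0, sigma_deg, size=(n_samples, 3)))
    return np.stack([points @ rotation_matrix(*a).T for a in angles])

# Toy "scanned hand": a few 3D surface points (stand-in for real scan data).
hand = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.2, 0.1],
                 [0.8, 1.0, 0.0],
                 [0.2, 0.9, 0.3]])

dictionary = build_dictionary(hand, n_samples=50)
print(dictionary.shape)  # (50, 4, 3): 50 synthetic posture samples
```

A classifier would then be trained on (features of) this enlarged dictionary rather than on the single scan, which is what makes the method cheap to calibrate for a new user while staying robust to posture variation.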
Pages: 292-297
Page count: 6
Related Papers (50 total)
  • [1] Regression-Based 3D Hand Pose Estimation for Human-Robot Interaction
    Bandi, Chaitanya
    Thomas, Ulrike
    COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS, VISIGRAPP 2020, 2022, 1474 : 507 - 529
  • [2] Human-Robot Interaction through 3D Vision and Force Control
    Jevtic, Aleksandar
    Doisy, Guillaume
    Bodiroza, Sasa
    Edan, Yael
    Hafner, Verena V.
    HRI'14: PROCEEDINGS OF THE 2014 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2014, : 102 - 102
  • [3] Dynamic Hand Gesture Recognition Based on 3D Hand Pose Estimation for Human-Robot Interaction
    Gao, Qing
    Chen, Yongquan
    Ju, Zhaojie
    Liang, Yi
    IEEE SENSORS JOURNAL, 2022, 22 (18) : 17421 - 17430
  • [4] Realtime 3D Segmentation for Human-Robot Interaction
    Ueckermann, Andre
    Haschke, Robert
    Ritter, Helge
    2013 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2013, : 2136 - 2143
  • [5] A Robot Navigation Method Based on Human-Robot Interaction for 3D Environment Mapping
    Zhao, Lijun
    Li, Xiaoyu
    Sun, Zhenye
    Wang, Ke
    Yang, Chenguang
    2017 IEEE INTERNATIONAL CONFERENCE ON REAL-TIME COMPUTING AND ROBOTICS (RCAR), 2017, : 409 - 414
  • [6] 3D Hand and Object Pose Estimation for Real-time Human-robot Interaction
    Bandi, Chaitanya
    Kisner, Hannes
    Thomas, Ulrike
    PROCEEDINGS OF THE 17TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS (VISAPP), VOL 4, 2022, : 770 - 780
  • [7] A human-robot interaction system based on 3D mapping and virtual reality
    Zhang H.
    Wang P.
    Xiao J.-H.
    Lu H.-M.
    Kongzhi yu Juece/Control and Decision, 2018, 33 (11): : 1975 - 1982
  • [8] Human behavior and hand gesture classification for smart human-robot interaction
    Mendes, Nuno
    Ferrer, Joao
    Vitorino, Joao
    Safeea, Mohammad
    Neto, Pedro
    27TH INTERNATIONAL CONFERENCE ON FLEXIBLE AUTOMATION AND INTELLIGENT MANUFACTURING, FAIM2017, 2017, 11 : 91 - 98
  • [9] Computer vision-based hand gesture recognition for human-robot interaction: a review
    Qi, Jing
    Ma, Li
    Cui, Zhenchao
    Yu, Yushu
    COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (01) : 1581 - 1606