Classification of Hand Postures Based on 3D Vision Model for Human-Robot Interaction

Cited by: 0
Authors
Takimoto, Hironori [1 ]
Yoshimori, Seiki [2 ]
Mitsukura, Yasue [3 ]
Fukumi, Minoru [4 ]
Affiliations
[1] Okayama Prefectural Univ, Fac Comp Sci & Syst Engn Commun & Engn, 111 Kuboki, Okayama 7191197, Japan
[2] Nippon Bunri Univ, Fac Engn, Oita 8700397, Japan
[3] Tokyo Univ Agr & Technol, Grad Sch BioAppl & Syst Engn, Tokyo 1848588, Japan
[4] Univ Tokushima, Dept Informat Sci & Intelligent Syst, Tokushima 7708506, Japan
Source
2010 IEEE RO-MAN | 2010
Keywords
RECOGNITION;
DOI
Not available
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, a method for hand posture recognition that is robust to changes in hand posture in real environments is proposed. Conventionally, data glove devices and 3D scanners have been used to extract hand shape features; however, the performance of each approach degrades when the hand posture changes. This paper therefore proposes posture fluctuation models for efficient hand posture recognition, based on the 3D hand shape and color features obtained from a stereo camera. A large dictionary for posture recognition is built from learned hand images that are automatically generated from a single scanned hand image according to the proposed models. To show the effectiveness of the proposed method, its recognition performance and processing time are compared with those of a conventional method. In addition, an evaluation experiment using Japanese Sign Language is performed.
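A minimal illustrative sketch (in Python) of the dictionary-expansion idea described in the abstract follows. It is not the authors' implementation: the paper's posture fluctuation models and its combined 3D shape and color features from a stereo camera are not reproduced here, and the rotation/scale perturbations, the fixed 64x64 template size, and all function names are hypothetical stand-ins. One scanned hand image per posture is expanded into many perturbed templates, and a query image is classified by nearest-neighbor matching.

# Illustrative sketch only; assumptions noted in the text above.
import numpy as np
from scipy.ndimage import rotate, zoom

SHAPE = (64, 64)  # assumed fixed template size

def _normalize(img):
    """Crop/pad to SHAPE and scale to unit norm so dot products compare fairly."""
    out = np.zeros(SHAPE, dtype=np.float32)
    h, w = min(SHAPE[0], img.shape[0]), min(SHAPE[1], img.shape[1])
    out[:h, :w] = img[:h, :w]
    n = np.linalg.norm(out)
    return out / n if n > 0 else out

def expand_to_dictionary(template, angles=(-20, -10, 0, 10, 20), scales=(0.9, 1.0, 1.1)):
    """Auto-generate perturbed templates from a single scanned hand image."""
    entries = []
    for a in angles:
        for s in scales:
            img = rotate(template.astype(np.float32), a, reshape=False, order=1)
            img = zoom(img, s, order=1)
            entries.append(_normalize(img))
    return np.stack(entries)  # shape: (num_templates, 64, 64)

def classify(query, dictionaries):
    """dictionaries maps posture label -> stacked templates; returns the best-matching label."""
    q = _normalize(query.astype(np.float32)).ravel()
    best_label, best_score = None, -np.inf
    for label, templates in dictionaries.items():
        scores = templates.reshape(len(templates), -1) @ q  # cosine similarity
        if scores.max() > best_score:
            best_label, best_score = label, float(scores.max())
    return best_label

In such a sketch, one dictionary would be built per posture, e.g. dictionaries = {label: expand_to_dictionary(img) for label, img in scans.items()}, before calling classify on each camera frame; the paper instead derives its dictionary entries from the proposed posture fluctuation models.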
Pages: 292 - 297
Number of pages: 6