Classification of Hand Postures Based on 3D Vision Model for Human-Robot Interaction

Cited by: 0
Authors
Takimoto, Hironori [1 ]
Yoshimori, Seiki [2 ]
Mitsukura, Yasue [3 ]
Fukumi, Minoru [4 ]
Affiliations
[1] Okayama Prefectural Univ, Fac Comp Sci & Syst Engn, Dept Commun Engn, 111 Kuboki, Okayama 7191197, Japan
[2] Nippon Bunri Univ, Fac Engn, Oita 8700397, Japan
[3] Tokyo Univ Agr & Technol, Grad Sch BioAppl & Syst Engn, Tokyo 1848588, Japan
[4] Univ Tokushima, Dept Informat Sci & Intelligent Syst, Tokushima 7708506, Japan
Source
2010 IEEE RO-MAN, 2010
Keywords
RECOGNITION
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, a method for hand posture recognition that is robust to changes in hand posture in real environments is proposed. Conventionally, data glove devices and 3D scanners have been used to extract hand-shape features. However, the performance of each approach degrades when the hand posture changes. This paper therefore proposes a posture fluctuation model for efficient hand posture recognition, based on the 3D hand shape and color features obtained from a stereo camera. A large dictionary for posture recognition is built from learned hand images that are automatically generated from a single scanned hand image according to several proposed models. To show the effectiveness of the proposed method, its recognition performance and processing time are compared with those of a conventional method. In addition, an evaluation experiment using Japanese Sign Language is performed.
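The abstract describes a pipeline (one scanned hand image, model-driven generation of many training variants, dictionary-based matching) without implementation details. The sketch below only illustrates that dictionary-building idea under stated assumptions: the paper's posture fluctuation models are replaced here by simple geometric perturbations (rotation and scale via OpenCV), the feature is the raw pixel vector rather than the paper's 3D-shape and color features, matching is plain nearest-neighbour, and the function names (build_dictionary, classify) are hypothetical, not from the paper.

```python
# Minimal sketch, NOT the authors' method: posture fluctuation is
# approximated by rotation/scale perturbations of a single template image,
# and recognition is a nearest-neighbour match over raw pixel vectors.
import numpy as np
import cv2

def build_dictionary(template, angles=(-20, -10, 0, 10, 20), scales=(0.9, 1.0, 1.1)):
    """Generate perturbed copies of one hand image (stand-in for the
    paper's model-driven image generation from a single scanned image)."""
    h, w = template.shape[:2]
    entries = []
    for angle in angles:
        for scale in scales:
            # Rotate/scale about the image centre; uncovered borders become 0.
            M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, scale)
            warped = cv2.warpAffine(template, M, (w, h))
            entries.append(warped.astype(np.float32).ravel())
    return np.stack(entries)  # shape: (len(angles) * len(scales), h * w)

def classify(query, dictionaries):
    """Return the posture label whose dictionary holds the entry with the
    smallest Euclidean distance to the query image."""
    q = query.astype(np.float32).ravel()
    best_label, best_dist = None, np.inf
    for label, entries in dictionaries.items():
        dist = np.linalg.norm(entries - q, axis=1).min()
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical usage, one grayscale template per posture class:
# dictionaries = {name: build_dictionary(img) for name, img in templates.items()}
# predicted = classify(captured_frame, dictionaries)
```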
Pages: 292-297
Page count: 6
Related Papers (50 total; items 21-30 listed)
  • [21] Masuyama, Naoki; Loo, Chu Kiong; Seera, Manjeevan. Personality affected robotic emotional model with associative memory for human-robot interaction. NEUROCOMPUTING, 2018, 272: 213-225.
  • [22] Xie, Ping; Qiu, Shi; Li, Xinxin; Du, Yihao; Wu, Xiaoguang; Guo, Zihui. Adaptive Trajectory Planning of Lower Limb Rehabilitation Robot Based on EMG and Human-Robot Interaction. 2016 IEEE INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION (ICIA), 2016: 1273-1277.
  • [23] Mazhar, Osama; Ramdani, Sofiane; Navarro, Benjamin; Passama, Robin; Cherubini, Andrea. Towards Real-time Physical Human-Robot Interaction using Skeleton Information and Hand Gestures. 2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2018: 7336-7341.
  • [24] Deng, Jingjing; Xie, Xianghua; Daubney, Ben; Fang, Hui; Grant, Phil W. Recognizing Conversational Interaction Based on 3D Human Pose. ADVANCED CONCEPTS FOR INTELLIGENT VISION SYSTEMS, ACIVS 2013, 2013, 8192: 138-149.
  • [25] Mazhar, Osama; Navarro, Benjamin; Ramdani, Sofiane; Passama, Robin; Cherubini, Andrea. A real-time human-robot interaction framework with robust background invariant hand gesture detection. ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2019, 60: 34-48.
  • [26] Li, Zhihao; Mu, Yishan; Sun, Zhenglong; Song, Sifan; Su, Jionglong; Zhang, Jiaming. Intention Understanding in Human-Robot Interaction Based on Visual-NLP Semantics. FRONTIERS IN NEUROROBOTICS, 2021, 14.
  • [27] Hasan, Haitham; Kareem, S. Abdul. Human Computer Interaction for Vision Based Hand Gesture Recognition: A Survey. 2012 INTERNATIONAL CONFERENCE ON ADVANCED COMPUTER SCIENCE APPLICATIONS AND TECHNOLOGIES (ACSAT), 2012: 55-60.
  • [28] Rautaray, Siddharth S.; Agrawal, Anupam. Vision based hand gesture recognition for human computer interaction: a survey. ARTIFICIAL INTELLIGENCE REVIEW, 2015, 43 (01): 1-54.
  • [30] Ristic-Durrant, Danijela; Grigorescu, Sorin M.; Graeser, Axel; Cojbasic, Zarko; Nikolic, Vlastimir. Robust Stereo-Vision Based 3D Object Reconstruction for the Assistive Robot FRIEND. ADVANCES IN ELECTRICAL AND COMPUTER ENGINEERING, 2011, 11 (04): 15-22.