Vision Based Hand Gesture Recognition Using 3D Shape Context

Cited by: 31
Authors
Zhu, Chen [1 ]
Yang, Jianyu [1 ]
Shao, Zhanpeng [2 ]
Liu, Chunping [3 ]
Affiliations
[1] Soochow Univ, Sch Rail Transportat, Suzhou 215131, Peoples R China
[2] Zhejiang Univ Technol, Coll Comp Sci & Technol, Hangzhou 310023, Peoples R China
[3] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
3D shape context; depth map; hand shape segmentation; hand gesture recognition; human-computer interaction; OBJECT RECOGNITION; VISUAL TRACKING;
DOI
10.1109/JAS.2019.1911534
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Hand gesture recognition is a popular topic in computer vision and makes human-computer interaction more flexible and convenient. The representation of hand gestures is critical for recognition. In this paper, we propose a new method to measure the similarity between hand gestures and exploit it for hand gesture recognition. Our method uses depth maps of hand gestures captured by Kinect sensors, from which the 3D hand shapes can be segmented out of cluttered backgrounds. To extract the pattern of salient 3D shape features, we propose a new descriptor, 3D Shape Context, for 3D hand gesture representation. The 3D Shape Context information of each 3D point is computed at multiple scales, because both the local shape context and the global shape distribution are necessary for recognition. The descriptions of all the 3D points together constitute the hand gesture representation, and hand gesture recognition is performed with the dynamic time warping algorithm. Extensive experiments are conducted on multiple benchmark datasets. The experimental results verify that the proposed method is robust to noise, articulated variations, and rigid transformations. Our method outperforms state-of-the-art methods in both accuracy and efficiency.
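The abstract states that recognition is performed by comparing gesture representations with dynamic time warping (DTW). As a rough illustration of that matching step, here is a minimal generic DTW sketch; the function name `dtw_distance`, the use of per-point feature tuples, and the Euclidean local cost are illustrative assumptions, not the paper's actual implementation.

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two feature sequences.

    Each sequence is a list of feature vectors (tuples of floats);
    the local cost is the Euclidean distance between two vectors.
    """
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j] = best alignment cost of seq_a[:i] against seq_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a point in seq_b
                                 cost[i][j - 1],      # skip a point in seq_a
                                 cost[i - 1][j - 1])  # match the two points
    return cost[n][m]
```

A nearest-neighbor classifier would then label a query gesture with the class of the template whose DTW distance to it is smallest; identical sequences yield a distance of zero.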
Pages: 1600-1613
Page count: 14
Related Papers
(50 records in total)
  • [22] Real-Time Hand Gesture Recognition Based on Vision
    Ren, Yu
    Gu, Chengcheng
    ENTERTAINMENT FOR EDUCATION: DIGITAL TECHNIQUES AND SYSTEMS, 2010, 6249 : 468 - 475
  • [23] Static hand gesture recognition method based on the Vision Transformer
    Zhang, Yu
    Wang, Junlin
    Wang, Xin
    Jing, Haonan
    Sun, Zhanshuo
    Cai, Yu
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (20) : 31309 - 31328
  • [24] Vision Based Hand Gesture Recognition for Mobile Devices: A Review
    Lahiani, Houssem
    Kherallah, Monji
    Neji, Mahmoud
    PROCEEDINGS OF THE 16TH INTERNATIONAL CONFERENCE ON HYBRID INTELLIGENT SYSTEMS (HIS 2016), 2017, 552 : 308 - 318
  • [25] Dynamic hand gesture recognition using vision-based approach for human-computer interaction
    Singha, Joyeeta
    Roy, Amarjit
    Laskar, Rabul Hussain
    NEURAL COMPUTING & APPLICATIONS, 2018, 29 (04) : 1129 - 1141
  • [26] Architecture Design and VLSI Implementation of 3D Hand Gesture Recognition System
    Tsai, Tsung-Han
    Tsai, Yih-Ru
    SENSORS, 2021, 21 (20)
  • [27] 3D separable convolutional neural network for dynamic hand gesture recognition
    Hu, Zhongxu
    Hu, Youmin
    Liu, Jie
    Wu, Bo
    Han, Dongmin
    Kurfess, Thomas
    NEUROCOMPUTING, 2018, 318 : 151 - 161
  • [28] Teaching a Robot Sign Language using Vision-Based Hand Gesture Recognition
    Zhi, Da
    de Oliveira, Thiago E. Alves
    da Fonseca, Vinicius Prado
    Petriu, Emil M.
    2018 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND VIRTUAL ENVIRONMENTS FOR MEASUREMENT SYSTEMS AND APPLICATIONS (CIVEMSA), 2018,
  • [29] Context-based hand gesture recognition for the operating room
    Jacob, Mithun George
    Wachs, Juan Pablo
    PATTERN RECOGNITION LETTERS, 2014, 36 : 196 - 203
  • [30] Dynamic Hand Gesture Recognition Using Multi-direction 3D Convolutional Neural Networks
    Li, Jie
    Yang, Mingqiang
    Liu, Yupeng
    Wang, Yanyan
    Zheng, Qinghe
    Wang, Deqiang
    ENGINEERING LETTERS, 2019, 27 (03) : 490 - 500