Non-trajectory-based gesture recognition in human-computer interaction based on hand skeleton data

Cited: 9
Authors
Jia, Lesong [1 ]
Zhou, Xiaozhou [1 ]
Xue, Chengqi [1 ]
Affiliations
[1] Southeast Univ, Sch Mech Engn, Nanjing 211189, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Gesture recognition; Skeleton data; HMM; HCI; INTERFACES; FEATURES;
DOI
10.1007/s11042-022-12355-8
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Currently, no efficient, accurate, and flexible algorithm has been developed to recognize non-trajectory-based gestures. We therefore aim to construct a gesture recognition algorithm that not only completes recognition accurately and quickly but also adapts to individual differences. In this paper, we present a novel non-trajectory-based gesture recognition method (NT-GRM) based on hand skeleton information and a hidden Markov model (HMM). To recognize a static gesture, the direction information of each bone section of the hand is taken as the observation data for constructing the HMM; a dynamic gesture is then identified by detecting multiple static gestures in sequence. Experimental verification shows that in a system containing ten interactive gestures, the NT-GRM achieves a recognition accuracy of over 95% with a recognition time of 21.73 ms, and the training time required for each static gesture model is 2.56 s. The NT-GRM can identify static and dynamic gestures accurately and quickly with small training samples in different functional modes. In conclusion, the NT-GRM can be applied to the development of gesture interaction systems to help developers realize practical functions such as gesture library construction, user gesture customization, and user gesture adaptation.
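The abstract describes classifying a static gesture by scoring a sequence of discretized bone-direction observations against one HMM per gesture. The sketch below illustrates that general idea only; it is not the authors' implementation, and the gesture names, toy two-state models, and probability values are all invented for illustration. Observation symbols 0 and 1 stand in for quantized bone-direction features.

```python
import numpy as np

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    # Forward algorithm for a discrete-observation HMM: accumulates
    # alpha_t(i) = P(o_1..o_t, state_t = i) and returns log P(obs | model).
    alpha = start_p * emit_p[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]
    return float(np.log(alpha.sum()))

def classify_gesture(obs, models):
    # Score the observation sequence under each gesture's HMM and
    # return the label with the highest log-likelihood.
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

# Toy per-gesture models (hypothetical names and numbers):
# each entry is (initial state probs, transition probs, emission probs).
models = {
    "fist": (
        np.array([1.0, 0.0]),
        np.array([[0.9, 0.1], [0.1, 0.9]]),
        np.array([[0.9, 0.1], [0.5, 0.5]]),
    ),
    "open_palm": (
        np.array([1.0, 0.0]),
        np.array([[0.9, 0.1], [0.1, 0.9]]),
        np.array([[0.1, 0.9], [0.5, 0.5]]),
    ),
}

print(classify_gesture([0, 0, 0], models))  # -> fist
print(classify_gesture([1, 1, 1], models))  # -> open_palm
```

Under this scheme, a dynamic gesture would be recognized by running the static classifier repeatedly and matching the resulting label sequence against a defined order of static postures, consistent with the paper's description of detecting multiple static gestures in turn.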
Pages: 20509-20539
Page count: 31