Hand Gesture Recognition Across Various Limb Positions Using a Multimodal Sensing System Based on Self-Adaptive Data-Fusion and Convolutional Neural Networks (CNNs)

Cited by: 13
Authors
Zhang, Shen [1 ,2 ]
Zhou, Hao [1 ,2 ]
Tchantchane, Rayane [1 ,2 ]
Alici, Gursel [1 ,2 ]
Affiliations
[1] Univ Wollongong, Sch Mech Mat Mechatron & Biomed Engn, Wollongong, NSW 2522, Australia
[2] Univ Wollongong, Adv Mechatron & Biomed Engn Res AMBER Grp, Wollongong, NSW 2522, Australia
Keywords
Sensors; Gesture recognition; Classification algorithms; Feature extraction; Electromyography; Convolutional neural networks; Training; Data fusion; deep learning (DL); hand gesture recognition; human-machine interface (HMI); limb position effect; multimodal sensing; sensor fusion; TIME;
DOI
10.1109/JSEN.2024.3389963
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809
Abstract
This study explores the challenge of hand gesture recognition across various limb positions using a new co-located multimodal armband system incorporating surface electromyography (sEMG) and pressure-based force myography (pFMG) sensors. Conventional machine learning (ML) algorithms and convolutional neural network (CNN) models were evaluated for accurately recognizing hand gestures. A comprehensive investigation was conducted, encompassing feature-level and decision-level CNN models, alongside advanced fusion techniques to enhance recognition performance. This research consistently demonstrates the superiority of CNN models, revealing their potential for extracting intricate patterns from raw multimodal sensor data. The study showed significant accuracy improvements over single-modality approaches, emphasizing the synergistic effects of multimodal sensing. Notably, the CNN models achieved 88.34% accuracy for self-adaptive decision-level fusion and 87.79% accuracy for feature-level fusion, outperforming linear discriminant analysis (LDA) with 83.33% accuracy when considering all nine gestures. Furthermore, the study explores the relationship between the number of hand gestures and recognition accuracy, revealing consistently high accuracy levels ranging from 88% to 100% for two to nine gestures and a remarkable 98% accuracy for the commonly used five gestures. This research underscores the adaptability of CNNs in effectively capturing the complex complementarity between multimodal data and varying limb positions, advancing the field of gesture recognition and emphasizing the potential of high-level data-fusion deep learning (DL) techniques in wearable sensing systems. This study provides valuable insights into how multimodal sensor/data fusion, coupled with advanced ML methods, enhances hand gesture recognition accuracy, ultimately paving the way for more effective and adaptable wearable technology applications.
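The abstract contrasts two CNN-based fusion strategies over co-located sEMG and pFMG signals: feature-level fusion and self-adaptive decision-level fusion. The following minimal PyTorch sketch illustrates that general idea only; it is not the authors' implementation. The channel counts, window length, layer sizes, and the single learned mixing weight are illustrative assumptions, and the paper's self-adaptive fusion scheme is merely approximated here by a learned blend of per-modality class probabilities.

```python
# Minimal sketch (assumptions, not the paper's architecture) of feature-level
# vs. decision-level fusion of co-located sEMG and pFMG windows with 1-D CNNs.
import torch
import torch.nn as nn

class ModalityCNN(nn.Module):
    """Per-modality 1-D CNN feature extractor over a raw sensor window."""
    def __init__(self, in_channels: int, feat_dim: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch, 64, 1)
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, x):                      # x: (batch, channels, samples)
        return self.proj(self.body(x).squeeze(-1))

class FeatureLevelFusion(nn.Module):
    """Concatenate per-modality features, then classify jointly."""
    def __init__(self, n_classes: int = 9):
        super().__init__()
        self.emg = ModalityCNN(in_channels=8)  # assumed 8 sEMG channels
        self.fmg = ModalityCNN(in_channels=8)  # assumed 8 pFMG channels
        self.head = nn.Linear(64 * 2, n_classes)

    def forward(self, emg_x, fmg_x):
        return self.head(torch.cat([self.emg(emg_x), self.fmg(fmg_x)], dim=1))

class DecisionLevelFusion(nn.Module):
    """Per-modality classifiers whose class probabilities are blended by a
    learned weight -- a simplified stand-in for self-adaptive fusion."""
    def __init__(self, n_classes: int = 9):
        super().__init__()
        self.emg = ModalityCNN(in_channels=8)
        self.fmg = ModalityCNN(in_channels=8)
        self.emg_head = nn.Linear(64, n_classes)
        self.fmg_head = nn.Linear(64, n_classes)
        self.alpha = nn.Parameter(torch.tensor(0.5))   # learned mixing weight

    def forward(self, emg_x, fmg_x):
        p_emg = torch.softmax(self.emg_head(self.emg(emg_x)), dim=1)
        p_fmg = torch.softmax(self.fmg_head(self.fmg(fmg_x)), dim=1)
        w = torch.sigmoid(self.alpha)                   # keep weight in (0, 1)
        return w * p_emg + (1.0 - w) * p_fmg

if __name__ == "__main__":
    emg = torch.randn(4, 8, 200)   # 4 windows, 8 channels, 200 samples (assumed)
    fmg = torch.randn(4, 8, 200)
    print(FeatureLevelFusion()(emg, fmg).shape)    # torch.Size([4, 9])
    print(DecisionLevelFusion()(emg, fmg).shape)   # torch.Size([4, 9])
```

In this sketch, the feature-level model learns a joint representation before a single classifier, whereas the decision-level model keeps per-modality classifiers and blends their outputs, mirroring the two fusion levels compared in the study.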
Pages: 18633-18645
Number of pages: 13
Related Papers
47 in total
[1]   Deep Learning with Convolutional Neural Networks Applied to Electromyography Data: A Resource for the Classification of Movements for Prosthetic Hands [J].
Atzori, Manfredo ;
Cognolato, Matteo ;
Mueller, Henning .
FRONTIERS IN NEUROROBOTICS, 2016, 10
[2]   Electromyography-Based Gesture Recognition: Is It Time to Change Focus From the Forearm to the Wrist? [J].
Botros, Fady S. ;
Phinyomark, Angkoon ;
Scheme, Erik J. .
IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (01) :174-184
[3]  
Briouza, Sami, 2021, 2021 International Conference on Data Analytics for Business and Industry (ICDABI), P107, DOI 10.1109/ICDABI53623.2021.9655876
[4]   Current Trends and Confounding Factors in Myoelectric Control: Limb Position and Contraction Intensity [J].
Campbell, Evan ;
Phinyomark, Angkoon ;
Scheme, Erik .
SENSORS, 2020, 20 (06)
[5]   User-Independent Hand Gesture Recognition Classification Models Using Sensor Fusion [J].
Colli Alfaro, Jose Guillermo ;
Trejos, Ana Luisa .
SENSORS, 2022, 22 (04)
[6]   Interpreting Deep Learning Features for Myoelectric Control: A Comparison With Handcrafted Features [J].
Cote-Allard, Ulysse ;
Campbell, Evan ;
Phinyomark, Angkoon ;
Laviolette, Francois ;
Gosselin, Benoit ;
Scheme, Erik .
FRONTIERS IN BIOENGINEERING AND BIOTECHNOLOGY, 2020, 8
[7]   A robust, real-time control scheme for multifunction myoelectric control [J].
Englehart, K ;
Hudgins, B .
IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2003, 50 (07) :848-854
[8]   EMG-Centered Multisensory Based Technologies for Pattern Recognition in Rehabilitation: State of the Art and Challenges [J].
Fang, Chaoming ;
He, Bowei ;
Wang, Yixuan ;
Cao, Jin ;
Gao, Shuo .
BIOSENSORS-BASEL, 2020, 10 (08)
[9]   Multi-Modal Sensing Techniques for Interfacing Hand Prostheses: A Review [J].
Fang, Yinfeng ;
Hettiarachchi, Nalinda ;
Zhou, Dalin ;
Liu, Honghai .
IEEE SENSORS JOURNAL, 2015, 15 (11) :6065-6076
[10]  
Fora, Malak, 2023, 2023 IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology (JEEIT), P22, DOI 10.1109/JEEIT58638.2023.10185697