STCNet: Spatio-Temporal Cross Network with subject-aware contrastive learning for hand gesture recognition in surface EMG

Cited by: 1
Authors
Yang, Jaemo [1 ]
Cha, Doheun [2 ]
Lee, Dong-Gyu [3 ]
Ahn, Sangtae [2 ]
Affiliations
[1] School of Electronics Engineering, Kyungpook National University, Daegu
[2] School of Electronic and Electrical Engineering, Kyungpook National University, Daegu
[3] Department of Artificial Intelligence, Kyungpook National University, Daegu
Funding
National Research Foundation of Singapore
Keywords
Contrastive learning; Convolutional neural networks; Hand gesture recognition; Subject awareness; Surface electromyography;
DOI
10.1016/j.compbiomed.2024.109525
Abstract
This paper introduces the Spatio-Temporal Cross Network (STCNet), a novel deep learning architecture tailored for robust hand gesture recognition from surface electromyography (sEMG) across multiple subjects. We address the challenges of inter-subject variability and environmental factors such as electrode shift and muscle fatigue, which traditionally undermine the robustness of gesture recognition systems. STCNet integrates a convolutional-recurrent architecture with a spatio-temporal block that extracts features over segmented time intervals, enhancing both spatial and temporal analysis. Additionally, a rolling convolution technique designed to reflect the circular band structure of the sEMG measurement device is incorporated, capturing the inherent spatial relationships more effectively. We further propose a subject-aware contrastive learning framework that uses both subject and gesture label information to align representations in the embedding space. Our comprehensive experimental evaluations demonstrate the superiority of STCNet under aggregated conditions, achieving state-of-the-art performance on benchmark datasets and effectively managing variability among subjects. The implemented code can be found at https://github.com/KNU-BrainAI/STCNet. © 2024 Elsevier Ltd
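The abstract does not detail the rolling convolution, but the idea it names is well defined: electrodes on a circular sEMG armband wrap around, so electrode 0 is physically adjacent to the last electrode, and circular (wrap-around) padding preserves that adjacency where zero padding would discard it. The following is a minimal NumPy sketch of that idea, assuming a 1-D convolution over the electrode axis; the function name `rolling_conv` and the 8-electrode, 3-tap setup are illustrative choices, not taken from the paper.

```python
import numpy as np

def rolling_conv(x, kernel):
    """Circular 1-D convolution over the electrode axis.

    x      : (n_electrodes,) one time sample from a circular sEMG band
    kernel : (k,) convolution weights, k odd

    Wrap-around padding keeps the adjacency between the first and last
    electrodes of the armband, which zero padding would break.
    """
    k = len(kernel)
    pad = k // 2
    # Prepend the tail and append the head so indexing wraps around.
    xp = np.concatenate([x[-pad:], x, x[:pad]])
    return np.array([np.dot(xp[i:i + k], kernel) for i in range(len(x))])

# 8-electrode band with a simple 3-tap averaging kernel:
x = np.arange(8, dtype=float)          # electrode readings 0..7
out = rolling_conv(x, np.ones(3) / 3)
# Electrode 0 averages its true neighbours 7, 0, 1 -> (7 + 0 + 1) / 3
```

In a framework such as PyTorch the same effect is typically obtained with `padding_mode="circular"` on a convolution layer; the explicit padding above only makes the wrap-around mechanics visible.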