A Chinese sign language recognition system combining attention mechanism and acoustic sensing

Cited by: 0
Authors
Shi, Yuepeng [1 ]
Wu, Yansheng [1 ]
Li, Qian [1 ]
Zhang, Junyi [1 ]
Affiliations
[1] School of Energy and Intelligent Engineering, Henan University of Animal Husbandry and Economy, Zhengzhou
Source
MCB Molecular and Cellular Biomechanics | 2024 / Vol. 21 / Iss. 04
Keywords
acoustic sensing; attention mechanism; channel impulse response; sign language gesture recognition;
DOI
10.62617/mcb793
Abstract
In recent years, with the widespread adoption of smart devices and the rapid development of communication and artificial intelligence technologies, sign language gestures, which can break down the communication barriers between ordinary people and those with speech and hearing impairments, have received much attention. However, existing human gesture recognition methods are wearable-device-based, computer-vision-based, or Radio Frequency (RF) signal-based, and they suffer from difficult deployment, violation of user privacy, and susceptibility to ambient light. Compared with these methods, sensing sign language gestures with ultrasonic signals neither violates user privacy nor is affected by ambient light. To this end, we use the built-in speaker and microphone of a smartphone to send and receive ultrasonic signals for sign language gesture recognition. To recognize fine-grained sign language gestures, we compute the Channel Impulse Response (CIR) induced by the sign language motion as the gesture feature. We then take first-order differences along the time dimension of the CIR matrix to eliminate static path interference. Finally, the resulting features are fed into a convolutional neural network combining convolutional layers with spatial and channel attention to recognize sign language gestures. Experimental results show that the scheme achieves a recognition accuracy of 95.2% on 12 sign language interaction gestures. Copyright © 2024 by author(s).
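The static-path-removal step described in the abstract can be illustrated with a small sketch. The CIR matrix below is synthetic (the paper estimates it from received ultrasonic signals, a step not reproduced here); the point is that reflections from static objects contribute a component that is constant across time frames, so a first-order difference along the time axis cancels it and keeps only motion-induced changes:

```python
import numpy as np

# Hypothetical CIR matrix: rows = time frames, columns = channel taps.
# Values are synthetic stand-ins for a CIR estimated from ultrasound.
rng = np.random.default_rng(0)
static_path = rng.normal(size=(1, 64))      # static reflections, constant over time
motion = 0.1 * rng.normal(size=(50, 64))    # motion-induced variation per frame
cir = static_path + motion                  # 50 frames x 64 taps

# First-order difference along the time dimension: the static component
# is identical in consecutive frames and cancels out exactly.
dcir = np.diff(cir, axis=0)                 # shape (49, 64)

print(dcir.shape)  # → (49, 64)
```

The differenced matrix is what would then be passed to the attention-augmented convolutional network for classification.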