Dynamic gesture recognition using hand pose-based neural networks for sign language interpretation

Cited by: 1
Authors
Sharma, Vaidehi [1 ]
Sood, Nehil [1 ]
Jaiswal, Mohita [1 ]
Sharma, Abhishek [1 ]
Saini, Sandeep [1 ]
Chang, Jieh-Ren [2 ]
Affiliations
[1] LNMIIT, Elect & Commun Engn, Jaipur, Rajasthan, India
[2] Natl Ilan Univ, Dept Elect Engn, Yilan, Taiwan
Keywords
Sign language recognition; Indian sign language; Hand gesture recognition; Long short-term memory networks
DOI
10.1007/s10209-024-01162-7
CLC number
TP3 [Computing technology; computer technology]
Discipline code
0812
Abstract
Dynamic two-hand sign language recognition benefits individuals with speech and hearing impairments and remains an important problem in computer vision research. Real-time recognition of such gestures is challenging because the machine must accurately identify combinations of motion patterns and interpret them within strict time constraints. Existing vision-based systems for dynamic gesture recognition are inefficient because they extract excessive features from long sequences of frames, and many are limited to real-time recognition of static gestures only. This paper therefore proposes a vision-based dynamic sign language recognition system using Time Distributed Long Short-Term Memory networks that addresses these shortcomings. The pipeline for building the corpus of the proposed Dynamic Two Hands Indian Sign Language dataset is also described. The proposed model uses a keypoint-based feature extraction technique to improve both the accuracy and the efficiency of the system. Experimental results show a recognition rate of 99.6%, surpassing other current state-of-the-art methods.
Pages: 1673-1685
Page count: 13
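
The abstract describes a Time Distributed LSTM classifier operating on per-frame hand keypoints. As a rough illustration of that style of architecture (not the authors' exact network), the sketch below builds a minimal Keras model; the sequence length, landmark count, layer sizes, and number of classes are assumed placeholders.

# Minimal sketch (assumed architecture, not the paper's reported model): a
# TimeDistributed + LSTM classifier over per-frame hand-keypoint features,
# e.g. landmarks from a pose estimator such as MediaPipe Hands.
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN = 30            # frames per dynamic gesture (assumed)
FEATURES = 2 * 21 * 3   # two hands x 21 landmarks x (x, y, z) coordinates (assumed)
NUM_CLASSES = 10        # number of sign classes (assumed)

model = keras.Sequential([
    keras.Input(shape=(SEQ_LEN, FEATURES)),
    # TimeDistributed dense layer embeds each frame's keypoints independently
    layers.TimeDistributed(layers.Dense(64, activation="relu")),
    # stacked LSTMs model the temporal dynamics across the frame sequence
    layers.LSTM(128, return_sequences=True),
    layers.LSTM(64),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()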