Continuous Arabic Sign Language Recognition Models

Cited by: 1
Authors
Algethami, Nahlah [1 ]
Farhud, Raghad [1 ]
Alghamdi, Manal [1 ]
Almutairi, Huda [1 ]
Sorani, Maha [1 ]
Aleisa, Noura [2 ]
Affiliations
[1] Saudi Elect Univ, Coll Comp & Informat, Comp Sci Dept, Riyadh 11673, Saudi Arabia
[2] Saudi Elect Univ, Coll Comp & Informat, Informat Technol Dept, Riyadh 11673, Saudi Arabia
Keywords
Arabic Sign Language; TCN; BiLSTM; MediaPipe; sign language recognition;
DOI
10.3390/s25092916
CLC Number
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
A significant communication gap persists between the deaf and hearing communities, often leaving deaf individuals isolated and marginalised. This challenge is especially pronounced for Arabic speakers, given the scarcity of publicly available Arabic Sign Language datasets and dedicated recognition systems. This study is the first to apply a Temporal Convolutional Network (TCN) to Arabic Sign Language (ArSL) recognition. We created a custom dataset of the 30 most common sentences in ArSL and improved recognition performance by enhancing a Recurrent Neural Network (RNN) that incorporates a Bidirectional Long Short-Term Memory (BiLSTM) layer. Our approach achieved substantially higher accuracy than the baseline RNN-BiLSTM model. This study contributes to the development of recognition systems that could bridge communication barriers for the hearing-impaired community. Through a comparative analysis, we assessed how well the TCN and the enhanced RNN architecture capture the temporal dependencies and semantic nuances unique to Arabic Sign Language. The models were trained and evaluated on the created dataset of Arabic sign gestures in terms of recognition accuracy, processing speed, and robustness to variations in signing style. This research provides insights into the strengths and limitations of TCNs and the enhanced RNN-BiLSTM by investigating their applicability to sign language recognition scenarios. The results indicate that the TCN model achieved an accuracy of 99.5%, while the original RNN-BiLSTM model initially achieved 96% accuracy and improved to 99% after enhancement. Although the accuracy gap between the two models was small, the TCN demonstrated significant advantages in computational efficiency, requiring fewer resources and achieving faster inference times. These factors make TCNs more practical for real-time sign language recognition applications.
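The abstract's core claim rests on the TCN's ability to model temporal dependencies across a gesture's frame sequence using causal dilated convolutions. The sketch below is not the authors' implementation; it is a minimal NumPy illustration of the building block a TCN stacks, with hypothetical shapes (8 frames, 2 keypoint features per frame) standing in for MediaPipe landmark sequences. It also demonstrates the causality property: an output at time t never depends on future frames.

```python
import numpy as np

def causal_dilated_conv(x, w, dilation):
    """Causal dilated 1D convolution over a (T, C) sequence.

    x: (T, C) input sequence; w: (K, C, C_out) kernel.
    Output at time t depends only on x[t - dilation*(K-1) ... t].
    """
    T, C = x.shape
    K, _, C_out = w.shape
    pad = dilation * (K - 1)
    # Left-pad only, so no future frame leaks into the output (causality).
    xp = np.concatenate([np.zeros((pad, C)), x], axis=0)
    out = np.zeros((T, C_out))
    for t in range(T):
        for k in range(K):
            out[t] += xp[t + pad - dilation * k] @ w[K - 1 - k]
    return out

# Hypothetical shapes: T=8 frames, C=2 keypoint features per frame.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 2))
w = rng.standard_normal((3, 2, 4))  # kernel size 3, 2 -> 4 channels

y1 = causal_dilated_conv(x, w, dilation=1)  # receptive field: 3 frames
y2 = causal_dilated_conv(x, w, dilation=2)  # receptive field: 5 frames

# Causality check: perturbing the last frame never changes earlier outputs.
x_perturbed = x.copy()
x_perturbed[7] += 10.0
y1b = causal_dilated_conv(x_perturbed, w, dilation=1)
assert np.allclose(y1[:7], y1b[:7])
```

Stacking such layers with exponentially growing dilations (1, 2, 4, ...) lets the receptive field cover an entire gesture in few layers, and, unlike a BiLSTM, every layer's outputs can be computed in parallel across time steps, which is consistent with the inference-speed advantage reported in the abstract.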
Pages: 29