A computer vision-based system for recognition and classification of Urdu sign language dataset

Cited by: 0
Authors
Zahid H. [1 ,5 ]
Rashid M. [2 ]
Syed S.A. [3 ]
Ullah R. [4 ]
Asif M. [5 ]
Khan M. [3 ]
Mujeeb A.A. [6 ]
Khan A.H. [6 ]
Affiliations
[1] Biomedical Engineering Department and Electrical Engineering Department, Ziauddin University, Karachi
[2] Electrical Engineering Department and Software Engineering Department, Ziauddin University, Karachi
[3] Biomedical Engineering Department, Sir Syed University of Engineering and Technology, Karachi
[4] Optimizia, Karachi
[5] Electrical Engineering Department, Ziauddin University, Karachi
[6] Biomedical Engineering Department, Ziauddin University, Karachi
Keywords
Bag of words; KNN; Pattern recognition; Random Forest; Sign language; SVM; Urdu sign language
DOI
10.7717/PEERJ-CS.1174
Abstract
Human beings rely heavily on social interaction, and language is the most effective means of verbal and nonverbal communication and association. To bridge the communication gap between the deaf community and hearing people, sign language is widely used. According to the World Federation of the Deaf, there are about 70 million deaf people around the globe and about 300 sign languages in use. Hence, structured hand gestures involving visual motions and signs serve as a communication system that helps the deaf and speech-impaired community in daily interaction. The aim of this work is to collect a dataset of Urdu Sign Language (USL) and test it with machine learning classifiers. The proposed system is divided into four main stages, i.e., data collection, data acquisition, model training, and model testing. The USL dataset, comprising 1,560 images, was created by photographing various hand positions with a camera. This work provides a strategy for automated identification of USL numbers based on a bag-of-words (BoW) paradigm. For classification, support vector machine (SVM), Random Forest, and K-nearest neighbor (K-NN) classifiers are used with the BoW histogram bin frequencies as features. The proposed technique outperforms others in number classification, attaining accuracies of 88%, 90%, and 84% for Random Forest, SVM, and K-NN, respectively. © 2022 Zahid et al.
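Below is a minimal sketch of the BoW pipeline described in the abstract, assuming OpenCV ORB descriptors, a k-means visual vocabulary, and scikit-learn classifiers. The abstract does not specify the feature detector, vocabulary size, or classifier hyperparameters, so every function name and parameter value here is an illustrative assumption, not the authors' implementation.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

def extract_descriptors(image_paths):
    # Local ORB descriptors per image (descriptor choice is an assumption).
    orb = cv2.ORB_create()
    per_image = []
    for path in image_paths:
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, desc = orb.detectAndCompute(gray, None)
        per_image.append(desc if desc is not None else np.empty((0, 32), np.uint8))
    return per_image

def bow_histograms(per_image_desc, k=100):
    # Cluster all descriptors into k visual words, then build one normalised
    # bin-frequency histogram per image (the BoW features used for classification).
    all_desc = np.vstack([d for d in per_image_desc if len(d)]).astype(np.float32)
    codebook = KMeans(n_clusters=k, n_init=10, random_state=0).fit(all_desc)
    hists = []
    for desc in per_image_desc:
        words = codebook.predict(desc.astype(np.float32)) if len(desc) else []
        hist, _ = np.histogram(words, bins=np.arange(k + 1))
        hists.append(hist / max(hist.sum(), 1))
    return np.array(hists)

# The histogram bin frequencies feed the three classifiers compared in the paper.
classifiers = {
    "SVM": SVC(kernel="rbf"),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "K-NN": KNeighborsClassifier(n_neighbors=5),
}
```

Given labelled histograms X and y, each classifier would be trained with classifiers[name].fit(X_train, y_train) and evaluated on a held-out split, mirroring the accuracy comparison reported in the abstract.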