A comprehensive survey and taxonomy of sign language research

Cited by: 21
Authors
El-Alfy, El-Sayed M. [1]
Luqman, Hamzah
Affiliations
[1] King Fahd University of Petroleum & Minerals, College of Computing and Mathematics, Information and Computer Science Department, Dhahran, Saudi Arabia
Keywords
Sign language recognition; Sign language translation; Manual gestures; Non-manual gestures; Sign language database; HAND GESTURE RECOGNITION; DISCRETE WAVELET TRANSFORM; SENTENCE RECOGNITION; FEATURE-EXTRACTION; SYSTEM; FRAMEWORK; HEARING; COMBINATION; ALGORITHM; NETWORKS
DOI
10.1016/j.engappai.2022.105198
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Sign language relies on visual gestures of human body parts to convey meaning and plays a vital role in modern society, both for communicating and interacting with people who have hearing difficulties and for human-machine interaction applications. The field has attracted growing attention in recent years, with research outcomes covering various issues including sign acquisition, segmentation, recognition, translation, and linguistic structure. In this paper, a comprehensive, up-to-date survey of the state-of-the-art literature on automated sign language processing is presented. The survey provides a taxonomy and review of the body of knowledge and research efforts, with a focus on acquisition devices, available databases, and recognition techniques for fingerspelling signs, isolated sign words, and continuous sentence recognition systems. It covers recent advances, including deep machine learning and multimodal approaches, and discusses various related challenges. The survey is directed at junior researchers and industry developers working on sign language gesture recognition and related systems, to help them gain insight into the distinctive aspects and current status of the existing landscape as well as future perspectives leading to further advancements.
Pages: 17