A comprehensive survey and taxonomy of sign language research

Cited by: 21
Authors
El-Alfy, El-Sayed M. [1]
Luqman, Hamzah
Affiliations
[1] King Fahd Univ Petr & Minerals, Coll Comp & Math, Informat & Comp Sci Dept, Dhahran, Saudi Arabia
Keywords
Sign language recognition; Sign language translation; Manual gestures; Non-manual gestures; Sign language database; HAND GESTURE RECOGNITION; DISCRETE WAVELET TRANSFORM; SENTENCE RECOGNITION; FEATURE-EXTRACTION; SYSTEM; FRAMEWORK; HEARING; COMBINATION; ALGORITHM; NETWORKS
DOI
10.1016/j.engappai.2022.105198
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Sign language relies on visual gestures of the human body to convey meaning and plays a vital role in modern society, both for communicating and interacting with people with hearing difficulties and for human-machine interaction applications. The field has attracted growing attention in recent years, and a substantial body of research has emerged covering issues such as sign acquisition, segmentation, recognition, translation, and linguistic structure. This paper presents a comprehensive, up-to-date survey of the state-of-the-art literature on automated sign language processing. The survey provides a taxonomy and review of the body of knowledge and research efforts, with a focus on acquisition devices, available databases, and recognition techniques for fingerspelling signs, isolated sign words, and continuous sentence recognition systems. It covers recent advances, including deep learning and multimodal approaches, and discusses related challenges. The survey is directed at junior researchers and industry developers working on sign language gesture recognition and related systems, helping them gain insight into the distinctive aspects and current status of the existing landscape as well as future perspectives leading to further advancements.
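To make the deep-learning recognition techniques mentioned in the abstract more concrete, the sketch below shows one common pattern for isolated sign word recognition: a per-frame convolutional encoder followed by a bidirectional LSTM over the frame sequence. It is a minimal illustrative example written in PyTorch, not code from the surveyed paper; the class name IsolatedSignClassifier, all layer sizes, and the 50-word toy vocabulary are assumptions.

import torch
import torch.nn as nn

class IsolatedSignClassifier(nn.Module):
    # Hypothetical model: classifies a clip of shape (B, T, C, H, W)
    # into one of num_classes isolated sign words.
    def __init__(self, num_classes: int, hidden_size: int = 256):
        super().__init__()
        # Per-frame spatial feature extractor (stand-in for a deeper CNN).
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (B*T, 64)
        )
        # Bidirectional LSTM captures the temporal dynamics of the gesture.
        self.temporal = nn.LSTM(64, hidden_size, batch_first=True,
                                bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        b, t, c, h, w = clips.shape
        feats = self.frame_encoder(clips.view(b * t, c, h, w)).view(b, t, -1)
        outputs, _ = self.temporal(feats)          # (B, T, 2*hidden_size)
        return self.classifier(outputs[:, -1])     # logits over the sign vocabulary

# Toy usage: 2 clips of 16 RGB frames at 64x64 pixels, 50 candidate signs.
model = IsolatedSignClassifier(num_classes=50)
logits = model(torch.randn(2, 16, 3, 64, 64))
print(logits.shape)   # torch.Size([2, 50])

Continuous sentence recognition would typically replace the final clip-level classifier with per-frame outputs trained under a sequence criterion such as CTC, while fingerspelling recognition reduces to the same pipeline over a shorter clip and an alphabet-sized vocabulary.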
Pages: 17