Hand gesture recognition using Leap Motion via deterministic learning

Cited by: 37
Authors
Zeng, Wei [1 ]
Wang, Cong [2 ]
Wang, Qinghui [1 ]
Affiliations
[1] Longyan Univ, Sch Mech & Elect Engn, Longyan 364012, Peoples R China
[2] South China Univ Technol, Sch Automat Sci & Engn, Guangzhou 510640, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Hand gesture recognition; Deterministic learning; Leap Motion; Hand motion dynamics; RBF neural networks; MODEL; REPRESENTATION; RETRIEVAL; SYSTEM;
DOI
10.1007/s11042-018-5998-1
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
With the development of multimedia technology, traditional interactive tools such as the mouse and keyboard cannot satisfy users' requirements. Touchless interaction has received considerable attention in recent years, with the benefit of removing the barriers of physical contact. Leap Motion is an interactive device that can be used to collect information about dynamic hand gestures, including the coordinates, acceleration and direction of the fingers. The aim of this study is to develop a new method for hand gesture recognition using a jointly calibrated Leap Motion via deterministic learning. Hand gesture features representing hand motion dynamics, including the spatial position and direction of the fingers, are derived from Leap Motion. The hand motion dynamics underlying the motion patterns of different gestures, which represent the Arabic numerals (0-9) and the capital English alphabet (A-Z), are modeled by constant radial basis function (RBF) neural networks. A bank of estimators is then constructed from the constant RBF networks. By comparing the set of estimators with a test gesture pattern, a set of recognition errors is generated, and the average L1 norms of the errors are taken as the recognition measure according to the smallest-error principle. Finally, experiments are carried out to demonstrate the high recognition performance of the proposed method. Using 2-fold, 10-fold and leave-one-person-out cross-validation, the correct recognition rates for the Arabic numerals are reported to be 94.2%, 95.1% and 90.2%, respectively, and for the English alphabet 89.2%, 92.9% and 86.4%, respectively.
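The recognition scheme described above (a bank of constant-weight RBF estimators, with the test gesture assigned to the class whose estimator gives the smallest average L1 prediction error) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF width, the number of centers, and the `recognize` helper are all assumptions made for the example.

```python
import numpy as np

def rbf_features(x, centers, width=0.5):
    """Gaussian RBF activations of a state vector x at fixed centers."""
    d = np.linalg.norm(x[None, :] - centers, axis=1)
    return np.exp(-(d / width) ** 2)

def estimate_dynamics(weights, x, centers):
    """Constant-weight RBF network approximating the hand motion
    dynamics: x_dot ~ W * S(x), with W learned during training."""
    return weights @ rbf_features(x, centers)

def recognize(test_traj, test_deriv, weight_bank, centers):
    """Compare a test gesture against the bank of trained estimators.

    test_traj   : (T, n) array of hand states (e.g. finger positions)
    test_deriv  : (T, n) array of observed state derivatives
    weight_bank : list of constant weight matrices, one per gesture class
    Returns the index of the class with the smallest average L1 error,
    per the smallest-error principle, plus all per-class errors.
    """
    errors = []
    for W in weight_bank:
        preds = np.array([estimate_dynamics(W, x, centers) for x in test_traj])
        errors.append(np.mean(np.abs(test_deriv - preds)))  # average L1 norm
    return int(np.argmin(errors)), errors
```

On synthetic data, a test trajectory generated by one class's dynamics is matched to that class because its estimator reproduces the derivatives with (near-)zero error, while the other estimators do not.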
Pages: 28185-28206
Page count: 22