Real-time gesture recognition based on feature recalibration network with multi-scale information

Cited by: 13
Authors
Cao, Zhengcai [1 ]
Xu, Xiaowen [1 ]
Hu, Biao [1 ]
Zhou, Meng [1 ]
Li, Qinglin [1 ]
Affiliations
[1] Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Gesture recognition; Human-machine interaction; Deep convolutional network; Contextual information;
DOI
10.1016/j.neucom.2019.03.019
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gesture recognition is important in human-machine interaction. Current gesture recognition methods have several drawbacks, such as low recognition rates, slow speed, and poor performance on multiple or long-distance targets in complex environments. In view of these problems, we propose a gesture recognition approach that recognizes gestures quickly and accurately against complex backgrounds. The approach is built on a deep convolutional network consisting of a basic network module for extracting feature information, a squeeze-and-excitation network for increasing feature channel affinity, and a feature pyramid attention module for fusing contextual information at different scales. To test the proposed approach, we built a data set containing 3289 images from different complex scenes, in which the gestures can be classified into 16 types. We have made this data set available for researchers. Experimental results demonstrate that the proposed method achieves a recognition accuracy of 83.45% at 32.2 frames per second, giving better overall performance than other state-of-the-art recognition algorithms. (C) 2019 Elsevier B.V. All rights reserved.
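For orientation, a minimal PyTorch sketch of the squeeze-and-excitation mechanism the abstract refers to (the reduction ratio of 16 follows the original SE-Net convention and is an assumption here, not a detail taken from this article):

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation block: recalibrates channel responses by
    learning per-channel gates from globally pooled feature statistics."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: global spatial average per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # excitation: per-channel gate in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # channel-wise recalibration of the input feature map
```

Inserted after a backbone stage, such a block lets the network emphasize gesture-relevant feature channels and suppress background ones, which is what the abstract means by "increasing feature channel affinity".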
Pages: 119-130
Number of pages: 12