Gesture recognition for human-robot collaboration: A review

Cited by: 257
Authors
Liu, Hongyi [1 ]
Wang, Lihui [1 ]
Affiliations
[1] KTH Royal Inst Technol, Dept Prod Engn, Stockholm, Sweden
Keywords
Human-robot collaboration; Gesture; Gesture recognition; HIDDEN MARKOV-MODELS; NEURAL-NETWORKS; PARTICLE FILTERS; VISUAL TRACKING; IMPOVERISHMENT; REGRESSION; OBJECTS; PARTS;
DOI
10.1016/j.ergon.2017.02.004
CLC classification number
T [Industrial Technology];
Subject classification code
08;
Abstract
Recently, the concept of human-robot collaboration has attracted considerable research interest. Instead of robots replacing human workers in the workplace, human-robot collaboration allows human workers and robots to work together in a shared manufacturing environment. Human-robot collaboration can relieve human workers of heavy tasks with assistive robots, provided that effective communication channels between humans and robots are established. Although the communication channels between human workers and robots are still limited, gesture recognition has long been applied effectively as an interface between humans and computers. Covering some of the most important technologies and algorithms of gesture recognition, this paper provides an overview of gesture recognition research and explores the possibility of applying gesture recognition in human-robot collaborative manufacturing. An overall model of gesture recognition for human-robot collaboration is also proposed, comprising four essential technical components: sensor technologies, gesture identification, gesture tracking and gesture classification. Reviewed approaches are classified according to these four components, and a statistical analysis follows the technical analysis. Towards the end of this paper, future research trends are outlined. (C) 2017 Elsevier B.V. All rights reserved.
Pages: 355-367
Page count: 13
Related references
114 in total
[41]  
Haykin S., 2004, Kalman Filtering and Neural Networks
[42]   Support vector machines [J].
Hearst, MA .
IEEE INTELLIGENT SYSTEMS & THEIR APPLICATIONS, 1998, 13 (04) :18-21
[43]  
Howe NR, 2000, ADV NEUR IN, V12, P820
[44]   MoDeep: A Deep Learning Framework Using Motion Features for Human Pose Estimation [J].
Jain, Arjun ;
Tompson, Jonathan ;
LeCun, Yann ;
Bregler, Christoph .
COMPUTER VISION - ACCV 2014, PT II, 2015, 9004 :302-315
[45]   P-N Learning: Bootstrapping Binary Classifiers by Structural Constraints [J].
Kalal, Zdenek ;
Matas, Jiri ;
Mikolajczyk, Krystian .
2010 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2010, :49-56
[46]  
Kalman R., 1960, Journal of Basic Engineering, V82, P35, DOI 10.1115/1.3662552
[47]   Hand Gesture Recognition Using Time-of-Flight Camera and Viewpoint Feature Histogram [J].
Kapuscinski, Tomasz ;
Oszust, Mariusz ;
Wysocki, Marian .
INTELLIGENT SYSTEMS IN TECHNICAL AND MEDICAL DIAGNOSTICS, 2014, 230 :403-414
[48]  
Katsuki Yugo, 2015, P 10 ANN ACMIEEE INT, P117
[49]  
Keogh E. J., 2001, Derivative Dynamic Time Warping, P1, DOI 10.1137/1.9781611972719.1
[50]   Cooperation of human and machines in assembly lines [J].
Krueger, J. ;
Lien, T. K. ;
Verl, A. .
CIRP ANNALS-MANUFACTURING TECHNOLOGY, 2009, 58 (02) :628-646