Gesture recognition using a depth camera for human robot collaboration on assembly line

Times Cited: 38
Authors
Coupete, Eva [1 ]
Moutarde, Fabien [1 ]
Manitsaris, Sotiris [1 ]
Affiliations
[1] Mines ParisTech, Robot Lab CAOR, 60 Blvd St Michel, F-75006 Paris, France
Source
6TH INTERNATIONAL CONFERENCE ON APPLIED HUMAN FACTORS AND ERGONOMICS (AHFE 2015) AND THE AFFILIATED CONFERENCES, AHFE 2015 | 2015 / Vol. 3
Keywords
Human-robot collaboration; Depth camera; Gesture recognition; Hands tracking;
DOI
10.1016/j.promfg.2015.07.216
Chinese Library Classification
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology]
Discipline Codes
03; 0303; 030303; 04; 0402
Abstract
We present a framework and preliminary experimental results for technical gesture recognition using an RGB-D camera. We study a collaborative task between a robot and an operator: the assembly of motor hoses. The goal is to enable the robot to understand which task the human operator has just executed, so that it can anticipate his actions, adapt its speed and react properly if an unusual event occurs. The depth camera is placed above the operator to minimize possible occlusions on the assembly line, and we track the operator's head and hands using the geodesic distances between the head and the pixels of his torso. To describe his movements, we use the shapes of the shortest routes joining the head to the hands. We then use a discrete HMM to learn and recognize five gestures performed during the motor-hose assembly. Using gestures from the same operator for both learning and recognition, we reach a good recognition rate of 93%. These results are encouraging, and ongoing work will let us test our setup on a larger pool of operators and recognize the gestures in real time. (C) 2015 The Authors. Published by Elsevier B.V.
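The recognition step described in the abstract, scoring an observation sequence against one discrete HMM per gesture and keeping the best-scoring model, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the gesture names, the one-state models, and all probabilities are invented for the example, and the real system would use observation symbols derived from the head-to-hand shortest-route shapes.

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | model) for a discrete HMM.

    pi:  (N,) initial state probabilities
    A:   (N, N) state-transition matrix
    B:   (N, M) emission probabilities over M discrete symbols
    obs: sequence of symbol indices in [0, M)
    """
    alpha = pi * B[:, obs[0]]          # forward variable at t = 0
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()               # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, then emit symbol o
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return ll

def classify(obs, models):
    """Return the gesture whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda g: log_likelihood(obs, *models[g]))

# Toy one-state models for two hypothetical gestures over a 2-symbol alphabet:
# "reach" mostly emits symbol 0, "screw" mostly emits symbol 1.
models = {
    "reach": (np.array([1.0]), np.array([[1.0]]), np.array([[0.9, 0.1]])),
    "screw": (np.array([1.0]), np.array([[1.0]]), np.array([[0.1, 0.9]])),
}
print(classify([0, 0, 1, 0], models))  # → reach
```

In practice each gesture's transition and emission matrices would be learned from labeled training sequences (e.g. with Baum-Welch), and classification reduces to the argmax over per-gesture likelihoods shown in `classify`.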
Pages: 518-525
Page count: 8