Gesture-based human-robot interaction for human assistance in manufacturing

Cited: 104
Authors
Neto, Pedro [1]
Simao, Miguel [1,2]
Mendes, Nuno [1]
Safeea, Mohammad [1,2]
Affiliations
[1] Univ Coimbra, Dept Mech Engn, Coimbra, Portugal
[2] Arts & Metiers, Lille, France
Keywords
Human-robot interaction; Collaborative robotics; Gesture recognition; Intuitive interfaces
DOI
10.1007/s00170-018-2788-x
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
The paradigm of robot usage has changed in recent years, from a scenario in which robots work in isolation to one in which robots collaborate with human beings, exploiting and combining the best abilities of robots and humans. The development and acceptance of collaborative robots depend strongly on reliable and intuitive human-robot interaction (HRI) on the factory floor. This paper proposes a gesture-based HRI framework in which a robot assists a human co-worker by delivering tools and parts and by holding objects during an assembly operation. Wearable sensors, namely inertial measurement units (IMUs), capture the human's upper-body gestures. The captured data are segmented into static and dynamic blocks using an unsupervised sliding-window approach, and these blocks feed an artificial neural network (ANN) that classifies static, dynamic, and composed gestures. For the HRI interface, we propose a parameterization robotic task manager (PRTM), in which the co-worker selects and validates robot options using gestures, guided by the system's speech and visual feedback. Experiments in an assembly operation demonstrated the efficiency of the proposed solution.
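As a rough illustration of the segmentation step described in the abstract (not the authors' implementation), the Python sketch below labels fixed-size windows of a multi-channel IMU stream as static or dynamic by thresholding per-window variance; the function name, 50-sample window, 0.05 threshold, and six-channel layout are assumptions made for the example.

    import numpy as np

    def segment_imu_stream(samples, window=50, var_threshold=0.05):
        # samples: (N, C) array of IMU readings (e.g., accelerometer/gyro channels).
        # A window whose mean per-channel variance stays below var_threshold is
        # treated as static (a held pose); otherwise it is dynamic (motion).
        labels = []
        for start in range(0, len(samples) - window + 1, window):
            block = samples[start:start + window]
            motion_energy = block.var(axis=0).mean()  # average variance over channels
            labels.append("dynamic" if motion_energy > var_threshold else "static")
        return labels

    # Toy usage: a held pose followed by a burst of motion.
    rng = np.random.default_rng(0)
    still = rng.normal(0.0, 0.01, size=(100, 6))   # low-variance "static" block
    moving = rng.normal(0.0, 1.0, size=(100, 6))   # high-variance "dynamic" block
    print(segment_imu_stream(np.vstack([still, moving])))
    # -> ['static', 'static', 'dynamic', 'dynamic']

In the paper's pipeline, blocks labeled this way would then be passed to the ANN classifier; the variance thresholding here merely stands in for the unsupervised segmentation idea.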
Pages: 119-135
Page count: 17