Towards a Safe Human-Robot Collaboration Using Information on Human Worker Activity

Cited by: 6
Authors
Orsag, Luka [1 ]
Stipancic, Tomislav [1 ]
Koren, Leon [1 ]
Affiliations
[1] Univ Zagreb, Fac Mech Engn & Naval Architecture, Ivana Lucica 5, Zagreb 10000, Croatia
Keywords
human-robot collaboration; activity recognition; deep learning; LSTM; safe HCI; adaptive manufacturing systems; robotics
DOI
10.3390/s23031283
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
Most industrial workplaces involving robots and other apparatus operate behind fences to prevent hazards and casualties. Recent advancements in machine learning can enable robots to cooperate with human co-workers while retaining safety, flexibility, and robustness. This article focuses on a computation model that provides a collaborative environment through intuitive and adaptive human-robot interaction (HRI). In essence, one layer of the model can be expressed as a set of useful information utilized by an intelligent agent, and within this construction the vision-sensing modality can be broken down into multiple layers. The authors propose a human-skeleton-based trainable model for the recognition of spatiotemporal human worker activity using LSTM networks, which achieves a training accuracy of 91.365% on the InHARD dataset. Alongside the training results, aspects of the simulation environment and planned improvements of the system are discussed. By combining human worker upper-body positions with actions, the perceptual potential of the system is increased and human-robot collaboration becomes context-aware. Based on the acquired information, the intelligent agent gains the ability to adapt its behavior to its dynamic and stochastic surroundings.
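The skeleton-based recognition pipeline the abstract describes (per-frame joint coordinates fed through an LSTM, with the final hidden state classified into an action label) can be illustrated with a minimal NumPy forward pass. This is a hedged sketch, not the authors' implementation: the joint count, class count, hidden size, and the randomly initialised weights are all placeholder assumptions; in the paper the weights would be learned from the InHARD dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 17 joints x 3 coordinates per frame, 8 action classes.
N_JOINTS, N_CLASSES, HIDDEN = 17, 8, 32
INPUT = N_JOINTS * 3

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised parameters stand in for weights learned during training.
W = rng.standard_normal((4 * HIDDEN, INPUT + HIDDEN)) * 0.1  # stacked gate weights
b = np.zeros(4 * HIDDEN)                                     # stacked gate biases
W_out = rng.standard_normal((N_CLASSES, HIDDEN)) * 0.1       # classification head
b_out = np.zeros(N_CLASSES)

def lstm_classify(seq):
    """Run one LSTM pass over a (T, INPUT) skeleton sequence and
    return softmax class probabilities from the final hidden state."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for x in seq:
        z = W @ np.concatenate([x, h]) + b
        i = sigmoid(z[:HIDDEN])                # input gate
        f = sigmoid(z[HIDDEN:2 * HIDDEN])      # forget gate
        o = sigmoid(z[2 * HIDDEN:3 * HIDDEN])  # output gate
        g = np.tanh(z[3 * HIDDEN:])            # candidate cell update
        c = f * c + i * g                      # new cell state
        h = o * np.tanh(c)                     # new hidden state
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Example: classify a synthetic 60-frame sequence of upper-body joint positions.
probs = lstm_classify(rng.standard_normal((60, INPUT)))
print(probs.shape)
```

In the context-aware HRI setting the abstract outlines, the predicted activity distribution would then inform the agent's behaviour adaptation, e.g. slowing or rerouting the robot when the recognised worker action implies shared workspace use.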
Pages: 16