Towards Multi-User Activity Recognition through Facilitated Training Data and Deep Learning for Human-Robot Collaboration Applications

Cited by: 2
Authors
Semeraro, Francesco [1 ]
Carberry, Jon [2 ]
Cangelosi, Angelo [1 ]
Affiliations
[1] Univ Manchester, Manchester Ctr Robot & AI, Manchester, Lancs, England
[2] BAE Syst Plc, BAE Syst Operat Ltd, Warton, England
Source
2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN | 2023
Funding
UK Engineering and Physical Sciences Research Council (EPSRC); EU Horizon 2020;
Keywords
multi-user activity recognition; single-user training data; concurrent tasks; multi-party human-robot collaboration; non-dyadic human-robot collaboration; deep learning; long short-term memory; variational autoencoder; spatio-temporal graph convolutional network; transfer learning; HRI;
DOI
10.1109/IJCNN54540.2023.10191782
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Human-robot interaction (HRI) research is progressively addressing multi-party scenarios, where a robot interacts with more than one human user at the same time. By contrast, such research is still at an early stage for human-robot collaboration (HRC). Using machine learning techniques to handle this type of collaboration requires data that are harder to produce than in a typical single-user HRC setup. This work outlines scenarios of concurrent tasks for non-dyadic HRC applications. Building on these concepts, the study also proposes an alternative way of gathering multi-user activity data: collecting data from single users and merging them in post-processing, to reduce the effort involved in recording pair settings. To validate this approach, 3D skeleton poses of single-user activities were collected and merged into pairs. These datapoints were then used to separately train a long short-term memory (LSTM) network and a variational autoencoder (VAE) composed of spatio-temporal graph convolutional networks (STGCN) to recognise the joint activities of pairs of people. The results show that data collected in this way can be used for pair HRC settings, achieving performance similar to that obtained with training data recorded from groups of users under the same settings, thereby avoiding the technical difficulties involved in producing such data. The related code and collected data are publicly available(1).
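As a concrete illustration of the data-merging idea described in the abstract, the sketch below shows one plausible way to combine two independently recorded single-user skeleton clips into a synthetic pair sample, followed by a minimal LSTM recogniser over the merged sequences. All names, array shapes (25-joint skeletons), layer sizes and class counts here are illustrative assumptions, not the paper's published pipeline.

import numpy as np
import torch
import torch.nn as nn

def merge_single_user_clips(seq_a, seq_b):
    # seq_a, seq_b: single-user skeleton sequences of shape (T, J, 3)
    # (T frames, J joints, xyz coordinates). The two clips are truncated
    # to a common length and stacked along a new "person" axis, yielding
    # a synthetic pair sample of shape (T', 2, J, 3).
    length = min(len(seq_a), len(seq_b))
    return np.stack([seq_a[:length], seq_b[:length]], axis=1)

class PairActivityLSTM(nn.Module):
    # Minimal LSTM recogniser for merged pair sequences; hidden size and
    # number of joint-activity classes are placeholders.
    def __init__(self, n_joints=25, hidden=128, n_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2 * n_joints * 3,
                            hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (batch, T, 2, J, 3)
        b, t = x.shape[:2]
        out, _ = self.lstm(x.reshape(b, t, -1))  # flatten persons/joints per frame
        return self.head(out[:, -1])             # classify from the final step

# Example: merge two separately recorded clips and score the pair activity.
user_a = np.random.randn(120, 25, 3).astype(np.float32)
user_b = np.random.randn(150, 25, 3).astype(np.float32)
pair = merge_single_user_clips(user_a, user_b)             # (120, 2, 25, 3)
logits = PairActivityLSTM()(torch.from_numpy(pair)[None])  # (1, n_classes)

Stacking along a person axis keeps the per-joint structure intact, which is what a graph-based model such as the STGCN-based VAE would need; the LSTM variant above simply flattens each frame into a feature vector.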
Pages: 9