An Accurate Prediction Method of Human Assembly Motion for Human-Robot Collaboration

Cited: 0
Authors
Zhou, Yangzheng [1 ]
Luo, Liang [2 ]
Li, Pengzhong [1 ]
Affiliations
[1] Tongji Univ, Sch Mech Engn, Shanghai 200092, Peoples R China
[2] Tongji Univ, Sino German Coll Postgrad Studies, Shanghai 200092, Peoples R China
Source
SYMMETRY-BASEL | 2024, Vol. 16, Issue 01
Keywords
human motion prediction; human-robot collaboration; collaborative assembly; real time; TRAJECTORY PREDICTION;
D O I
10.3390/sym16010118
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biosciences]; N [General Natural Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
In human-robot collaborative assembly, robots need to recognize and predict human behaviors accurately and then perform autonomous control and work route planning in real time. To support the recognition of human intervention behaviors and meet the real-time requirements of human-robot collaboration, the Fast Spatial-Temporal Transformer Network (FST-Trans), an accurate prediction method for human assembly motion, is proposed. We aim to maximize the symmetry between the predicted and actual motions while satisfying the real-time requirement. With a concise and efficient structural design, FST-Trans learns the spatial-temporal interactions of human joints during assembly in a shared latent space and captures more complex motion dynamics. To account for the inconsistent assembly speeds of different individuals, a velocity-acceleration loss is introduced that forces the network to learn a wider range of motion variations, enabling accurate prediction of assembly motions. An assembly dataset was collected and constructed for detailed comparative experiments and ablation studies, and the experimental results demonstrate the effectiveness of the proposed method.
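The velocity-acceleration loss mentioned in the abstract can be read as penalizing mismatches in the first and second temporal differences of the predicted joint trajectories. A minimal sketch of one such formulation follows; the function name, the L2-norm choice, and the weights `w_vel`/`w_acc` are assumptions for illustration, not the paper's exact definition:

```python
import numpy as np

def velocity_acceleration_loss(pred, target, w_vel=1.0, w_acc=1.0):
    """Hypothetical velocity-acceleration loss for motion prediction.

    pred, target: arrays of shape (T, J, 3) -- T frames, J joints, 3D coords.
    First temporal differences approximate joint velocities, second
    differences approximate accelerations; both are compared to the
    ground truth so the network must match motion dynamics, not just poses.
    """
    vel_p = np.diff(pred, n=1, axis=0)    # (T-1, J, 3) velocities
    vel_t = np.diff(target, n=1, axis=0)
    acc_p = np.diff(pred, n=2, axis=0)    # (T-2, J, 3) accelerations
    acc_t = np.diff(target, n=2, axis=0)
    loss_vel = np.mean(np.linalg.norm(vel_p - vel_t, axis=-1))
    loss_acc = np.mean(np.linalg.norm(acc_p - acc_t, axis=-1))
    return w_vel * loss_vel + w_acc * loss_acc
```

Because the loss is computed on differences rather than absolute positions, two trajectories that share the same shape but differ by a constant offset incur no penalty here; in practice such a term would be combined with an ordinary position loss.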
Pages: 15
References
41 in total
[1] Akhter I., 2008, ADV NEURAL INFORM PR, P41
[2] Braganca S., 2019, STUD SYST DECIS CONT, P641, DOI 10.1007/978-3-030-14730-368
[3] Cheng, Yujiao; Sun, Liting; Liu, Changliu; Tomizuka, Masayoshi. Towards Efficient Human-Robot Collaboration With Robust Plan Recognition and Trajectory Prediction [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5(02): 2602-2609
[4] Cheng, Yujiao; Zhao, Weiye; Liu, Changliu; Tomizuka, Masayoshi. Human Motion Prediction using Semi-adaptable Neural Networks [J]. 2019 AMERICAN CONTROL CONFERENCE (ACC), 2019: 4884-4890
[5] Choi, Andrew; Jawed, Mohammad Khalid; Joo, Jungseock. Preemptive Motion Planning for Human-to-Robot Indirect Placement Handovers [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022: 4743-4749
[6] Cui, Qiongjie; Sun, Huaijiang; Yang, Fei. Learning Dynamic Relationships for 3D Human Motion Prediction [J]. 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020: 6518-6526
[7] Dang, Lingwei; Nie, Yongwei; Long, Chengjiang; Zhang, Qing; Li, Guiqing. MSR-GCN: Multi-Scale Residual Graph Convolution Networks for Human Motion Prediction [J]. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 11447-11456
[8] El-Shamouty M., 2020, IEEE INT CONF ROBOT, P4899, DOI 10.1109/ICRA40945.2020.9196924
[9] Fragkiadaki, Katerina; Levine, Sergey; Felsen, Panna; Malik, Jitendra. Recurrent Network Models for Human Dynamics [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015: 4346-4354
[10] Goel, Pankaj; Mehta, Sandhya; Kumar, Raman; Castano, Fernando. Sustainable Green Human Resource Management Practices in Educational Institutions: An Interpretive Structural Modelling and Analytic Hierarchy Process Approach [J]. SUSTAINABILITY, 2022, 14(19)