HRC of intelligent assembly system based on multimodal gesture control

Cited by: 0
Authors
Jianguo Duan
Yuan Fang
Qinglei Zhang
Jiyun Qin
Affiliations
[1] Shanghai Maritime University, China Institute of FTZ Supply Chain
[2] Shanghai Maritime University, Logistics Engineering College
Source
The International Journal of Advanced Manufacturing Technology | 2023 / Volume 127
Keywords
HRC; Gesture control; Multimodal perception; Integrally shrouded blade
DOI
Not available
Abstract
As a natural expression of the human body, gestures are widely used in robot control. However, most gesture control methods rely on a single-modal feature representation, which brings a certain instability. To this end, this paper proposes a human-robot collaboration (HRC) framework for intelligent assembly systems with multimodal gesture control. The framework performs gesture recognition on RGB and RGB-D data through convolutional neural networks (CNNs) and combines the gesture data from both modalities for robot control, enabling data fusion and feature sharing. Image acquisition and gesture detection are integrated asynchronously, with spatially and temporally aligned RGB and RGB-D frames to ensure real-time gesture detection, and speed and separation monitoring is applied to ensure the safety of the collaboration process. To verify the effectiveness of the framework, it was applied to the assembly scene of the integrally shrouded blade-rotor system. Experiments show that the HRC of the intelligent assembly system with multimodal gesture control better realizes human-robot collaborative assembly of the integrally shrouded blade and improves the intelligence of the assembly process.
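
The multimodal combination described in the abstract can be pictured as a late fusion of per-class gesture scores from an RGB branch and an RGB-D branch. The sketch below is a minimal illustration, not the authors' implementation; the function names, the equal weighting `alpha`, and the three-class toy data are all assumptions.

```python
import numpy as np

def fuse_gesture_scores(rgb_scores: np.ndarray,
                        rgbd_scores: np.ndarray,
                        alpha: float = 0.5) -> int:
    """Weighted late fusion of per-class softmax scores; returns the fused class index."""
    fused = alpha * rgb_scores + (1.0 - alpha) * rgbd_scores
    return int(np.argmax(fused))

# Toy example with three gesture classes where the two modalities disagree slightly.
rgb = np.array([0.20, 0.50, 0.30])    # softmax output of the RGB CNN (assumed)
rgbd = np.array([0.10, 0.70, 0.20])   # softmax output of the RGB-D CNN (assumed)
print(fuse_gesture_scores(rgb, rgbd))  # prints 1
```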
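
Likewise, the speed and separation monitoring mentioned in the abstract can be sketched as a simplified protective-distance check in the spirit of ISO/TS 15066; all names, formulas, and numeric values below are illustrative assumptions and are not taken from the paper.

```python
def protective_distance(v_human: float, v_robot: float,
                        t_reaction: float, t_stop: float,
                        clearance: float = 0.1) -> float:
    """Minimum required human-robot separation in metres before the robot must slow or stop.

    Simplified form: human travel during the reaction and stopping phases,
    plus robot travel during the reaction phase, plus a fixed clearance margin.
    """
    return v_human * (t_reaction + t_stop) + v_robot * t_reaction + clearance

def robot_may_continue(measured_distance: float, **kwargs) -> bool:
    """True if the measured separation still exceeds the protective distance."""
    return measured_distance > protective_distance(**kwargs)

# Toy check: human approaching at 1.6 m/s, robot at 0.5 m/s, 0.1 s reaction, 0.3 s stop.
print(robot_may_continue(1.2, v_human=1.6, v_robot=0.5,
                         t_reaction=0.1, t_stop=0.3))  # prints True
```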
Pages: 4307-4319
Page count: 12