Multi-sensor fusion based industrial action recognition method under the environment of intelligent manufacturing

Cited by: 7
Authors
Wang Z. [1 ]
Yan J. [1 ]
Affiliations
[1] School of Mechatronics Engineering, Harbin Institute of Technology, Harbin
Keywords
Action recognition; D-S evidence theory; Industry 4.0; Intelligent manufacturing; Multi-sensor fusion
DOI
10.1016/j.jmsy.2024.04.019
Abstract
In the context of intelligent manufacturing and Industry 4.0, the manufacturing industry is rapidly transitioning toward mass personalized production. Despite this trend, the assembly industry still relies on manual operations, owing to workers' cognitive ability and flexibility. In this setting, methods for perceiving and recognizing operator actions are a vital field of study and of great significance for improving production efficiency and ensuring product quality. In this paper, a multi-sensor fusion-based data acquisition system is constructed to address the challenge that a single sensor cannot achieve comprehensive and accurate perception of the assembly process. An action recognition model architecture based on ResNet + LSTM + D-S evidence theory is then proposed. By fully exploiting the characteristics of each data modality, the value of the multi-sensor data is maximized, data complementarity is achieved, and the recognition accuracy exceeds 97%. This research is expected to provide guidance for increasing the degree of workshop automation and improving the efficiency and quality of the production process. © 2024
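The D-S (Dempster-Shafer) evidence fusion step named in the abstract can be sketched as follows. This is a minimal illustration of Dempster's combination rule, assuming each classifier branch (e.g. vision and inertial) outputs a softmax distribution treated as a mass function over singleton action hypotheses; the action labels and probability values are hypothetical, not taken from the paper.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over the same singleton hypotheses
    using Dempster's rule: m(A) ∝ m1(A) * m2(A), normalized by 1 - K,
    where K is the mass assigned to conflicting (disjoint) pairs."""
    # Conflict K: total mass on pairs of distinct (incompatible) hypotheses.
    conflict = sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; fusion undefined")
    norm = 1.0 - conflict
    # Only matching singletons have a non-empty intersection.
    return {a: m1[a] * m2[a] / norm for a in m1}

# Hypothetical branch outputs for three assembly actions
vision = {"grasp": 0.7, "place": 0.2, "screw": 0.1}
inertial = {"grasp": 0.6, "place": 0.3, "screw": 0.1}
fused = dempster_combine(vision, inertial)
```

Because both branches independently favor the same hypothesis, the fused mass concentrates on it more strongly than either branch alone, which is the complementarity effect the abstract describes.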
Pages: 575-586
Page count: 11