A gaze-driven manufacturing assembly assistant system with integrated step recognition, repetition analysis, and real-time feedback

Cited by: 5
Authors
Chen, Haodong [1 ]
Zendehdel, Niloofar [2 ]
Leu, Ming C. [2 ]
Yin, Zhaozheng [3 ,4 ]
Affiliations
[1] Univ Maryland, Dept Mech Engn, College Pk, MD 20742 USA
[2] Missouri Univ Sci & Technol, Dept Mech & Aerosp Engn, Rolla, MO USA
[3] SUNY Stony Brook, Dept Biomed Informat, Stony Brook, NY USA
[4] SUNY Stony Brook, Dept Comp Sci, Stony Brook, NY USA
Funding
U.S. National Science Foundation;
Keywords
Assembly assistance; Eye gaze estimation; Repetitive action counting; Transformer; Implemented artificial intelligence; Application of artificial intelligence; EYE-MOVEMENTS;
DOI
10.1016/j.engappai.2025.110076
CLC number
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
Modern manufacturing faces significant challenges, including efficiency bottlenecks and high error rates in manual assembly operations. To address these challenges, we propose a gaze-driven assembly assistant system that applies artificial intelligence (AI) for human-centered smart manufacturing. Our system processes video of assembly activities using a Convolutional Neural Network (CNN) combined with a Long Short-Term Memory (LSTM) network for assembly step recognition, a Transformer network for repetitive action counting, and a gaze tracker for eye gaze estimation. The system integrates the outputs of these tasks to deliver real-time visual assistance through a software interface that displays the relevant tools, parts, and procedural instructions based on the recognized step and gaze data. Experimental results demonstrate the system's high performance, achieving 98.36% accuracy in assembly step recognition, a mean absolute error (MAE) of 4.37%, and an off-by-one accuracy (OBOA) of 95.88% in action counting. Compared with existing solutions, our gaze-driven assistant offers superior precision and efficiency, providing a scalable and adaptable framework suitable for complex and large-scale manufacturing environments.
Pages: 16
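
Note on the reported metrics: the abstract quantifies counting performance with a normalized mean absolute error (MAE) and an off-by-one accuracy (OBOA). The Python sketch below illustrates how these two metrics are commonly computed in the repetitive-action-counting literature; the function name, the example counts, and the normalization by the ground-truth count are illustrative assumptions, not details taken from the paper.

import numpy as np

def repetition_counting_metrics(pred_counts, gt_counts):
    """Normalized MAE and off-by-one accuracy (OBOA), both as percentages.

    Follows the conventions commonly used for repetitive action counting;
    the paper's exact normalization may differ.
    """
    pred = np.asarray(pred_counts, dtype=float)
    gt = np.asarray(gt_counts, dtype=float)
    # MAE normalized by the ground-truth count of each clip.
    mae = np.mean(np.abs(pred - gt) / np.maximum(gt, 1e-8)) * 100.0
    # A prediction counts as correct if it is within +/- 1 repetition.
    oboa = np.mean(np.abs(pred - gt) <= 1) * 100.0
    return mae, oboa

# Hypothetical per-clip predicted vs. ground-truth repetition counts.
mae, oboa = repetition_counting_metrics([12, 8, 5, 10], [12, 9, 5, 11])
print(f"MAE: {mae:.2f}%  OBOA: {oboa:.2f}%")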