A novel adaptive visualization method based on user intention in AR manual assembly

Cited by: 4
Authors
Yan, Yuxiang [1 ]
Bai, Xiaoliang [1 ]
He, Weiping [1 ]
Wang, Shuxia [1 ]
Zhang, Xiangyu [1 ]
Wang, Peng [2 ]
Liu, Liwei [1 ]
Yu, Qing [1 ]
Affiliations
[1] Northwestern Polytech Univ, Cyber Phys Interact Lab, Xian 710072, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Sch Adv Mfg Engn, Chongqing 400065, Peoples R China
Keywords
User intention; Eye gaze; Spatial location; Adaptive; Visualization; Augmented reality; Design
DOI
10.1007/s00170-023-12557-w
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline Code
0812
Abstract
AR assembly instruction helps users perform manual assembly by overlaying virtual information on the real environment. However, users must manually change how the information is displayed, so the presentation fails to adapt to their varying information needs across the stages of an assembly task. This study proposes a user-centered adaptive visualization method for AR manual assembly that provides the necessary information according to the user's intention during the assembly process. We develop a system (UIAVS) that combines an information hierarchy mechanism with a user intention recognition method based on the user's eye gaze, spatial location, and current assembly task; the system adaptively adjusts the visibility of information, the amount of information displayed, and the visualization form. UIAVS was first tested in a small-engine assembly task. Twenty-four participants of different task familiarity (novice and expert) were randomly assigned to different information visualization response methods in the assembly experiments. The results show that UIAVS achieves better task performance, cognitive performance, and user experience than the traditional AR assembly instruction method, especially for experts. These findings offer guidance for the design of AR assembly instructions and extend the application of AR technology to practical, complex assembly tasks.
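To make the adaptive mechanism described in the abstract concrete, the sketch below illustrates one plausible way a system like UIAVS could map intention cues (eye gaze dwell, spatial location, and the state of the current assembly step) to an information detail level. The record does not give the paper's actual recognition algorithm or parameters; every name, threshold, and rule here (DetailLevel, UserState, infer_detail_level, the 0.8 s dwell and 0.3 m distance values) is an illustrative assumption, not the authors' implementation.

```python
from dataclasses import dataclass
from enum import Enum


class DetailLevel(Enum):
    """Hypothetical information hierarchy: how much instruction content to show."""
    HIDDEN = 0    # no overlay
    MINIMAL = 1   # target-part highlight only
    DETAILED = 2  # full step animation and text


@dataclass
class UserState:
    gaze_dwell_s: float           # time the gaze has rested on the target part
    distance_to_target_m: float   # hand/head distance to the assembly location
    step_done: bool               # whether the current assembly step is complete


def infer_detail_level(state: UserState,
                       dwell_threshold_s: float = 0.8,
                       near_threshold_m: float = 0.3) -> DetailLevel:
    """Map inferred user intention to an information detail level.

    Illustrative rules only: a long gaze dwell on the target suggests the user
    is seeking guidance (show detailed instructions); being close to the target
    without dwelling suggests the user is already executing the step (show only
    a minimal cue); a completed step hides the overlay.
    """
    if state.step_done:
        return DetailLevel.HIDDEN
    if state.gaze_dwell_s >= dwell_threshold_s:
        return DetailLevel.DETAILED
    if state.distance_to_target_m <= near_threshold_m:
        return DetailLevel.MINIMAL
    return DetailLevel.MINIMAL


if __name__ == "__main__":
    # Staring at the part location while still far away: likely needs guidance.
    print(infer_detail_level(UserState(1.2, 0.6, False)))  # DetailLevel.DETAILED
    # Hand already at the part, gaze moving on: show only a light cue.
    print(infer_detail_level(UserState(0.2, 0.1, False)))  # DetailLevel.MINIMAL
```

In a real system the returned level would drive what the AR headset renders (hiding overlays, highlighting only the target part, or playing a full step animation); that rendering layer is omitted from this sketch.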
Pages: 4705-4730
Page count: 26