Smart Task Assistance Through Deep Learning-Based Visual Guidance in Asymmetric XR Remote Collaboration

Cited by: 2
Authors
Moon, Hongju [1 ]
Yu, Seunghyeon [2 ]
Lee, Jae Yeol [2 ]
Affiliations
[1] Gwangju Technopark, Gwangju 61008, South Korea
[2] Chonnam Natl Univ, Dept Ind Engn, Gwangju 61186, South Korea
Funding
National Research Foundation of Singapore
Keywords
Collaboration; Visualization; Three-dimensional displays; Extended reality; Deep learning; Solid modeling; Sensors; Task analysis; augmented reality; virtual reality; asymmetric remote collaboration; deep learning-based visual guidance; smart task assistance; REALITY;
DOI
10.1109/ACCESS.2024.3455014
Chinese Library Classification
TP [Automation and computer technology]
Discipline code
0812
Abstract
In the emerging contactless era, many previous studies have paid attention to remote collaboration. Although remote collaboration offers the distinct advantage of enabling cooperative work independent of geographical limitations, the effectiveness of communication and the depth of shared understanding are limited compared with collaboration conducted face to face. These limitations can be overcome through extended reality (XR), which encompasses both augmented reality (AR) and virtual reality (VR). Although previous studies have integrated AR and VR for asymmetric collaboration, supporting visual guidance for effective task assistance remains challenging. This study proposes an asymmetric XR-based remote collaboration approach that supports smart task assistance by reconstructing the 3D virtual space of the local AR environment as a digital twin of the real-world spatial reference and by providing deep learning-based visual guidance, in addition to multimodal gestures such as hand gestures and eye gazing. Thus, a remote VR expert can comprehensively understand and explore the local working situation and interact with the AR worker through various interaction metaphors. In particular, the VR expert can guide the remote AR worker to perform tasks more effectively through step-by-step instructions with deep learning-based visual cues and annotations. A user study was conducted to explore the advantages of deep learning-based visual guidance for task assistance in asymmetric XR remote collaboration. The results showed that collaborating with deep learning-based visual guidance improved task execution time as well as several criteria concerning usability and workload. In addition, social presence was higher when eye gazing was provided. These findings can help in designing better XR-enabled remote collaboration and provide new directions for future research.
Pages: 126899-126914
Number of pages: 16