AR/MR Remote Collaboration on Physical Tasks: A Review

Cited by: 93
Authors
Wang, Peng [1 ]
Bai, Xiaoliang [1 ]
Billinghurst, Mark [1 ,2 ]
Zhang, Shusheng [1 ]
Zhang, Xiangyu [1 ]
Wang, Shuxia [1 ]
He, Weiping [1 ]
Yan, Yuxiang [1 ]
Ji, Hongyu [1 ]
Affiliations
[1] Northwestern Polytech Univ, Cyber Phys Interact Lab, Xian, Peoples R China
[2] Univ South Australia, Empath Comp Lab, Mawson Lakes, Australia
Funding
National Key Research and Development Program of China;
Keywords
Augmented Reality; Mixed Reality; Remote collaboration; Computer-supported collaborative work; Human-computer interaction; Physical tasks; AUGMENTED REALITY APPLICATIONS; VISUAL INFORMATION; VIEW INDEPENDENCE; SYSTEM; VIDEO; MAINTENANCE; AR; DESIGN; GAZE; AWARENESS;
DOI
10.1016/j.rcim.2020.102071
CLC Classification Number
TP39 [Computer Applications];
Discipline Classification Code
081203 ; 0835 ;
Abstract
This paper provides a review of research into using Augmented Reality (AR) and Mixed Reality (MR) for remote collaboration on physical tasks. AR/MR-based remote collaboration on physical tasks has recently become more prominent in academic research and engineering applications. It has great potential in many fields, such as real-time remote medical consultation, education, training, maintenance, remote assistance in engineering, and other remote collaborative tasks. However, to the best of our knowledge there has not been any comprehensive review of research in AR/MR remote collaboration on physical tasks. Therefore, this paper presents a comprehensive survey of research in this domain between 2000 and 2018. We collected 215 papers, more than 80% of which were published between 2010 and 2018, and all relevant works are discussed at length. We first elaborate on the review in terms of typical architectures, applications (e.g., industry, telemedicine, architecture, tele-education, and others), and empathic computing. Next, we provide an in-depth review of the papers from seven aspects: (1) collection and classification of the research, (2) use of 3D scene reconstruction environments and live panoramas, (3) periodicals and venues conducting research, (4) local and remote user interfaces, (5) commonly used features of user interfaces, (6) architectures and shared non-verbal cues, and (7) applications and toolkits. We find that most papers (160 articles, 74.4%) were published in conferences; using co-located collaboration to emulate remote collaboration is adopted by more than half (126, 58.6%) of the reviewed papers; the shared non-verbal cues can be mainly classified into five types (Virtual Replicas or Physical Proxies (VRP), AR Annotations or a Cursor Pointer (ARACP), avatar, gesture, and gaze); and the local/remote interface is mainly divided into four categories (Head-Mounted Displays (HMD), Spatial Augmented Reality (SAR), Windows-Icons-Menus-Pointer (WIMP), and Hand-Held Displays (HHD)).
From this, we draw ten conclusions and then report on open issues for future work. The paper also provides an overall academic roadmap and useful insight into the state of the art of AR/MR remote collaboration on physical tasks. This work will be useful for current and future researchers who are interested in collaborative AR/MR systems.
Pages: 32