EHTask: Recognizing User Tasks From Eye and Head Movements in Immersive Virtual Reality

Cited by: 16
Authors
Hu, Zhiming [1 ]
Bulling, Andreas [2 ]
Li, Sheng [1 ,3 ]
Wang, Guoping [1 ,3 ]
Affiliations
[1] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[2] Univ Stuttgart, D-70174 Stuttgart, Germany
[3] Peking Univ, Natl Biomed Imaging Ctr, Beijing 100871, Peoples R China
Funding
European Research Council; National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Task analysis; Videos; Head; Visualization; Virtual reality; Magnetic heads; Solid modeling; Visual attention; task recognition; eye movements; head movements; deep learning; virtual reality; GAZE PREDICTION;
DOI
10.1109/TVCG.2021.3138902
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
Understanding human visual attention in immersive virtual reality (VR) is crucial for many important applications, including gaze prediction, gaze guidance, and gaze-contingent rendering. However, prior work on visual attention analysis has typically examined only a single VR task and paid little attention to the differences between tasks. Moreover, existing task recognition methods have focused on 2D viewing conditions and explored only the effectiveness of human eye movements. To address these limitations, we first collect eye and head movements of 30 participants performing four tasks, i.e., Free viewing, Visual search, Saliency, and Track, in 15 360-degree VR videos. Using this dataset, we analyze the patterns of human eye and head movements and reveal significant differences across tasks in fixation duration, saccade amplitude, head rotation velocity, and eye-head coordination. We then propose EHTask, a novel learning-based method that employs eye and head movements to recognize user tasks in VR. We show that our method significantly outperforms state-of-the-art methods derived from 2D viewing conditions, both on our dataset (accuracy of 84.4% versus 62.8%) and on a real-world dataset (61.9% versus 44.1%). As such, our work provides meaningful insights into human visual attention under different VR tasks and guides future work on recognizing user tasks in VR.
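The attention statistics named in the abstract (fixation duration, saccade amplitude, head rotation velocity) are standard quantities computed from raw gaze samples. As a purely illustrative sketch, the snippet below segments a stream of gaze directions into fixations and saccades with a basic velocity-threshold (I-VT) scheme; the paper's actual preprocessing pipeline is not described in this record, and the function names, sampling rate, and threshold here are all assumptions.

```python
import math

def angular_velocity(p1, p2, dt):
    """Approximate angular speed (deg/s) between two gaze directions.
    p1, p2 are (yaw, pitch) in degrees; dt is the sample interval in seconds.
    Uses a small-angle Euclidean approximation for simplicity."""
    dyaw = p2[0] - p1[0]
    dpitch = p2[1] - p1[1]
    return math.hypot(dyaw, dpitch) / dt

def classify_samples(samples, dt=1 / 120, vel_threshold=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'
    following a basic I-VT rule: intervals slower than vel_threshold
    (deg/s) are fixations, faster ones are saccades."""
    labels = []
    for a, b in zip(samples, samples[1:]):
        v = angular_velocity(a, b, dt)
        labels.append("fixation" if v < vel_threshold else "saccade")
    return labels
```

From such labels, fixation duration follows from the length of consecutive fixation runs, and saccade amplitude from the angular distance spanned by each saccade run; head rotation velocity would be computed analogously from head-pose samples.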
Pages: 1992-2004
Page count: 13