EHTask: Recognizing User Tasks From Eye and Head Movements in Immersive Virtual Reality

Cited by: 16
Authors
Hu, Zhiming [1 ]
Bulling, Andreas [2 ]
Li, Sheng [1 ,3 ]
Wang, Guoping [1 ,3 ]
Affiliations
[1] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[2] Univ Stuttgart, D-70174 Stuttgart, Germany
[3] Peking Univ, Natl Biomed Imaging Ctr, Beijing 100871, Peoples R China
Funding
European Research Council; National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Task analysis; Videos; Head; Visualization; Virtual reality; Magnetic heads; Solid modeling; Visual attention; task recognition; eye movements; head movements; deep learning; virtual reality; GAZE PREDICTION;
DOI
10.1109/TVCG.2021.3138902
CLC Number
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
Understanding human visual attention in immersive virtual reality (VR) is crucial for many important applications, including gaze prediction, gaze guidance, and gaze-contingent rendering. However, previous works on visual attention analysis typically only explored one specific VR task and paid less attention to the differences between different tasks. Moreover, existing task recognition methods typically focused on 2D viewing conditions and only explored the effectiveness of human eye movements. We first collect eye and head movements of 30 participants performing four tasks, i.e., Free viewing, Visual search, Saliency, and Track, in 15 360-degree VR videos. Using this dataset, we analyze the patterns of human eye and head movements and reveal significant differences across different tasks in terms of fixation duration, saccade amplitude, head rotation velocity, and eye-head coordination. We then propose EHTask- a novel learning-based method that employs eye and head movements to recognize user tasks in VR. We show that our method significantly outperforms the state-of-the-art methods derived from 2D viewing conditions both on our dataset (accuracy of 84.4% versus 62.8%) and on a real-world dataset (61.9% versus 44.1%). As such, our work provides meaningful insights into human visual attention under different VR tasks and guides future work on recognizing user tasks in VR.
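For readers who want a concrete picture of what a learning-based task classifier over eye and head movement streams can look like, the following is a minimal Python/PyTorch sketch. It is not the EHTask architecture described in the paper; the input features (gaze and head angular velocities), window length, layer sizes, and class set are illustrative assumptions only.

```python
# Illustrative sketch only -- NOT the EHTask architecture from Hu et al.
# Assumption: each sample is a fixed-length window of per-frame eye-in-head
# and head angular-velocity features, labelled with one of the four tasks
# (Free viewing, Visual search, Saliency, Track).
import torch
import torch.nn as nn


class EyeHeadTaskClassifier(nn.Module):
    def __init__(self, in_features: int = 4, hidden: int = 64, num_tasks: int = 4):
        super().__init__()
        # Temporal convolution picks up short-term dynamics (e.g., saccades,
        # rapid head turns) from the raw feature sequence.
        self.conv = nn.Sequential(
            nn.Conv1d(in_features, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # A GRU summarises the whole window into a single state vector.
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_tasks)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features), e.g. gaze yaw/pitch velocity
        # concatenated with head yaw/pitch velocity per frame.
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, hidden)
        _, last = self.gru(h)                             # (1, batch, hidden)
        return self.classifier(last.squeeze(0))           # (batch, num_tasks) logits


if __name__ == "__main__":
    model = EyeHeadTaskClassifier()
    window = torch.randn(8, 250, 4)   # 8 windows, 250 frames, 4 features each
    print(model(window).shape)        # torch.Size([8, 4])
```

Such a model would be trained with a standard cross-entropy loss over the four task labels; the accuracies reported in the abstract (84.4% versus 62.8%) refer to the authors' actual EHTask model, not to this sketch.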
Pages: 1992-2004
Page count: 13
Related Papers
50 records in total
  • [31] Group Decision-Making in Multi-User Immersive Virtual Reality
    Moser, Ivan
    Chiquet, Sandra
    Strahm, Sebastian K.
    Mast, Fred W.
    Bergamin, Per
    CYBERPSYCHOLOGY BEHAVIOR AND SOCIAL NETWORKING, 2020, 23 (12) : 846 - 853
  • [32] Virtual Reality Conferencing: Multi-user immersive VR experiences on the web
    Gunkel, Simon N. B.
    Stokking, Hans M.
    Prins, Martin J.
    van der Stap, Nanda
    ter Haar, Frank B.
    Niamut, Omar A.
    PROCEEDINGS OF THE 9TH ACM MULTIMEDIA SYSTEMS CONFERENCE (MMSYS'18), 2018, : 498 - 501
  • [33] A Framework for Developing Multi-user Immersive Virtual Reality Learning Environments
    Checa, David
    Rodriguez-Garcia, Bruno
    Guillen-Sanz, Henar
    Miguel-Alonso, Ines
    EXTENDED REALITY, XR SALENTO 2023, PT I, 2023, 14218 : 89 - 103
  • [34] Scaffolding in immersive virtual reality environments for learning English: an eye tracking study
    Bacca-Acosta, Jorge
    Tejada, Julian
    Fabregat, Ramon
    Kinshuk
    Guevara, Juan
    ETR&D-EDUCATIONAL TECHNOLOGY RESEARCH AND DEVELOPMENT, 2022, 70 (01) : 339 - 362
  • [36] Assessing Creativity and User Experience in Immersive Virtual Reality with Cultural Heritage Learning
    Tang, Haiqing
    Zhang, Chi
    Li, Qiang
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2024,
  • [37] A Case for Studying Naturalistic Eye and Head Movements in Virtual Environments
    Callahan-Flintoft, Chloe
    Barentine, Christian
    Touryan, Jonathan
    Ries, Anthony J.
    FRONTIERS IN PSYCHOLOGY, 2021, 12
  • [38] User Cohabitation in Multi-stereoscopic Immersive Virtual Environment for Individual Navigation Tasks
    Chen, Weiya
    Ladeveze, Nicolas
    Clave, Celine
    Mestre, Daniel
    Bourdot, Patrick
    2015 IEEE VIRTUAL REALITY CONFERENCE (VR), 2015, : 47 - 54
  • [39] Gaze Estimation Based on Head Movements in Virtual Reality Applications using Deep Learning
    Soccini, Agata Marta
    2017 IEEE VIRTUAL REALITY (VR), 2017, : 413 - 414
  • [40] The Effect of Motion Direction and Eccentricity on Vection, VR Sickness and Head Movements in Virtual Reality
    Pohlmann, Katharina Margareta Theresa
    Focker, Julia
    Dickinson, Patrick
    Parke, Adrian
    O'Hare, Louise
    MULTISENSORY RESEARCH, 2021, 34 (06) : 623 - 662