EHTask: Recognizing User Tasks From Eye and Head Movements in Immersive Virtual Reality

Cited by: 16
Authors
Hu, Zhiming [1 ]
Bulling, Andreas [2 ]
Li, Sheng [1 ,3 ]
Wang, Guoping [1 ,3 ]
Affiliations
[1] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[2] Univ Stuttgart, D-70174 Stuttgart, Germany
[3] Peking Univ, Natl Biomed Imaging Ctr, Beijing 100871, Peoples R China
Funding
European Research Council; National Natural Science Foundation of China; National Key R&D Program of China;
Keywords
Task analysis; Videos; Head; Visualization; Virtual reality; Magnetic heads; Solid modeling; Visual attention; task recognition; eye movements; head movements; deep learning; virtual reality; GAZE PREDICTION;
DOI
10.1109/TVCG.2021.3138902
Chinese Library Classification (CLC)
TP31 [Computer Software];
Subject Classification Code
081202; 0835;
Abstract
Understanding human visual attention in immersive virtual reality (VR) is crucial for many important applications, including gaze prediction, gaze guidance, and gaze-contingent rendering. However, previous works on visual attention analysis typically explored only one specific VR task and paid little attention to the differences between tasks. Moreover, existing task recognition methods typically focused on 2D viewing conditions and only explored the effectiveness of human eye movements. To address these limitations, we first collect eye and head movements of 30 participants performing four tasks, i.e., Free viewing, Visual search, Saliency, and Track, in 15 360-degree VR videos. Using this dataset, we analyze the patterns of human eye and head movements and reveal significant differences across tasks in terms of fixation duration, saccade amplitude, head rotation velocity, and eye-head coordination. We then propose EHTask, a novel learning-based method that employs eye and head movements to recognize user tasks in VR. We show that our method significantly outperforms state-of-the-art methods derived from 2D viewing conditions both on our dataset (accuracy of 84.4% versus 62.8%) and on a real-world dataset (61.9% versus 44.1%). As such, our work provides meaningful insights into human visual attention under different VR tasks and guides future work on recognizing user tasks in VR.
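The abstract only sketches the learning-based pipeline, so the following is a minimal illustrative example rather than the published EHTask architecture: a small PyTorch sequence classifier that maps windows of eye and head angular-velocity signals to the four task labels. The channel layout, window length, sampling rate, and layer sizes are all assumptions made here for illustration.

import torch
import torch.nn as nn

class TaskClassifier(nn.Module):
    """Toy eye+head sequence classifier (illustrative only, not the authors' EHTask model)."""
    def __init__(self, in_channels: int = 4, num_tasks: int = 4, hidden: int = 64):
        super().__init__()
        # Assumed input channels: gaze yaw/pitch velocity and head yaw/pitch velocity.
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.gru = nn.GRU(input_size=64, hidden_size=hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_tasks)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels); Conv1d expects (batch, channels, time).
        feats = self.encoder(x.transpose(1, 2))
        _, h = self.gru(feats.transpose(1, 2))  # h: (1, batch, hidden)
        return self.classifier(h.squeeze(0))    # logits: (batch, num_tasks)

if __name__ == "__main__":
    # Toy usage: a batch of 8 one-second windows sampled at an assumed 100 Hz, 4 channels each.
    model = TaskClassifier()
    window = torch.randn(8, 100, 4)
    logits = model(window)
    print(logits.shape)  # torch.Size([8, 4])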
Pages: 1992-2004
Number of pages: 13
Related Papers
50 records in total
  • [1] Eye Movements and Vestibulo-Ocular Reflex as User Response in Virtual Reality
    Palomino-Roldan, Geovanny
    Rojas-Cessa, Roberto
    Suaste-Gomez, Ernesto
    IEEE ACCESS, 2023, 11 : 36856 - 36864
  • [2] A tutorial: Analyzing eye and head movements in virtual reality
    Bischof, Walter F.
    Anderson, Nicola C.
    Kingstone, Alan
    BEHAVIOR RESEARCH METHODS, 2024, 56 (08) : 8396 - 8421
  • [3] Eye movements to absent objects during mental imagery and visual memory in immersive virtual reality
    Chiquet, Sandra
    Martarelli, Corinna S.
    Mast, Fred W.
    VIRTUAL REALITY, 2021, 25 (03) : 655 - 667
  • [4] Indicators of Training Success in Virtual Reality Using Head and Eye Movements
    Gisler, Joy
    Schneider, Johannes
    Handali, Joshua
    Holzwarth, Valentin
    Hirt, Christian
    Fuhl, Wolfgang
    vom Brocke, Jan
    Kunz, Andreas
    2021 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY ADJUNCT PROCEEDINGS (ISMAR-ADJUNCT 2021), 2021, : 280 - 285
  • [5] Vergence eye movements in virtual reality
    McAnally, Ken
    Grove, Philip
    Wallis, Guy
    DISPLAYS, 2024, 83
  • [6] Exploring an Immersive User Interface in Virtual Reality Storytelling
    Seo, Gapyuel
    HCI INTERNATIONAL 2021 - LATE BREAKING POSTERS, HCII 2021, PT I, 2021, 1498 : 385 - 389
  • [7] Characteristics of user, tasks and interfaces in virtual reality
    Tyndiuk, F.
    Lespinet-Najib, V.
    Thomas, G.
    N'Kaoua, B.
    Schlick, C.
    Claverie, B.
    EUROPEAN REVIEW OF APPLIED PSYCHOLOGY-REVUE EUROPEENNE DE PSYCHOLOGIE APPLIQUEE, 2007, 57 (04) : 225 - 236
  • [8] Comparing Eye Tracking and Head Tracking During a Visual Attention Task in Immersive Virtual Reality
    Llanes-Jurado, Jose
    Marin-Morales, Javier
    Moghaddasi, Masoud
    Khatri, Jaikishan
    Guixeres, Jaime
    Alcaniz, Mariano
    HUMAN-COMPUTER INTERACTION: INTERACTION TECHNIQUES AND NOVEL APPLICATIONS, HCII 2021, PT II, 2021, 12763 : 32 - 43
  • [9] Assessing Saccadic Eye Movements With Head-Mounted Display Virtual Reality Technology
    Imaoka, Yu
    Flury, Andri
    de Bruin, Eling D.
    FRONTIERS IN PSYCHIATRY, 2020, 11