Getting (More) Real: Bringing Eye Movement Classification to HMD Experiments with Equirectangular Stimuli

Cited by: 1
Authors
Agtzidis, Ioannis [1]
Dorr, Michael [1]
Affiliations
[1] Tech Univ Munich, Munich, Germany
Source
ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS | 2019
Keywords
eye movement classification; event detection; 360 degrees content;
DOI
10.1145/3314111.3319829
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The classification of eye movements is a very important part of eye tracking research and has been studied since its early days. Over recent years, we have experienced an increasing shift towards more immersive experimental scenarios with the use of eye-tracking enabled glasses and head-mounted displays. In these new scenarios, however, most of the existing eye movement classification algorithms can no longer be applied robustly because they were developed with monitor-based experiments using regular 2D images and videos in mind. In this paper, we describe two approaches that reduce artifacts of eye movement classification for 360 degrees videos shown in head-mounted displays. For the first approach, we discuss how decision criteria have to change in the space of 360 degrees videos, and use these criteria to modify five popular algorithms from the literature. The modified algorithms are publicly available at https://web.gin.g-node.org/ioannis.agtzidis/360_em_algorithms. For cases where an existing algorithm cannot be modified, e.g. because it is closed-source, we present a second approach that maps the data instead of the algorithm to the 360 degrees space. An empirical evaluation of both approaches shows that they significantly reduce the artifacts of the initial algorithm, especially in areas farther from the horizontal midline.
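To illustrate the distortion the abstract refers to (this is not the authors' published code; the frame size, function names, and the specific mapping are illustrative assumptions), the sketch below converts equirectangular gaze coordinates to 3D unit vectors and measures gaze displacement as a great-circle angle. The same 100-pixel horizontal step corresponds to a much smaller visual angle near the pole than at the horizontal midline, which is why pixel-based velocity thresholds misclassify events far from the midline.

```python
import numpy as np

WIDTH, HEIGHT = 3840, 1920  # hypothetical equirectangular frame size

def equirect_to_vec(x, y, width=WIDTH, height=HEIGHT):
    """Map equirectangular pixel coordinates to a 3D unit gaze vector."""
    lon = (x / width) * 2.0 * np.pi - np.pi    # longitude in [-pi, pi]
    lat = np.pi / 2.0 - (y / height) * np.pi   # latitude in [-pi/2, pi/2]
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def angular_dist_deg(p, q):
    """Great-circle angle (degrees) between two gaze samples."""
    u, v = equirect_to_vec(*p), equirect_to_vec(*q)
    return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

# Same 100-pixel horizontal step at the midline vs. near the pole:
mid = angular_dist_deg((1000, HEIGHT // 2), (1100, HEIGHT // 2))
top = angular_dist_deg((1000, 50), (1100, 50))
print(f"midline: {mid:.2f} deg, near pole: {top:.2f} deg")
```

At the midline the 100-pixel step spans roughly 9.4 degrees of visual angle, while near the pole it spans well under 1 degree, so a fixed pixel-velocity criterion would label the same eye movement very differently depending on latitude.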
Pages: 8