Monocular Gaze Depth Estimation using the Vestibulo-Ocular Reflex

Times Cited: 10
Authors
Mardanbegi, Diako [1]
Clarke, Christopher [1]
Gellersen, Hans [1]
Affiliations
[1] Univ Lancaster, Lancaster, England
Source
ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS | 2019
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Eye tracking; eye movement; VOR; fixation depth; gaze depth estimation; 3D gaze estimation; TARGET DISTANCE; HEAD ROTATION; GAIN;
DOI
10.1145/3314111.3319822
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaze depth estimation presents a challenge for eye tracking in 3D. This work investigates a novel approach to the problem based on eye movement mediated by the vestibulo-ocular reflex (VOR). VOR stabilises gaze on a target during head movement, with eye movement in the opposite direction, and the VOR gain increases the closer the fixated target is to the viewer. We present a theoretical analysis of the relationship between VOR gain and depth which we investigate with empirical data collected in a user study (N=10). We show that VOR gain can be captured using pupil centres, and propose and evaluate a practical method for gaze depth estimation based on a generic function of VOR gain and two-point depth calibration. The results show that VOR gain is comparable with vergence in capturing depth while only requiring one eye, and provide insight into open challenges in harnessing VOR gain as a robust measure.
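The abstract describes estimating fixation depth from VOR gain via a generic gain-depth function and a two-point depth calibration. As an illustrative sketch only (the paper's actual function and parameterisation may differ), the commonly used idealised model is g(d) = a + b/d, where the offset of the eye from the head's rotation axis makes gain grow as the target gets closer; the function names below are hypothetical:

```python
def fit_two_point(d1, g1, d2, g2):
    """Fit the gain model g(d) = a + b/d from two calibration
    fixations at known depths d1, d2 with measured VOR gains g1, g2.
    Solves the 2x2 linear system g_i = a + b/d_i in closed form."""
    b = (g1 - g2) / (1.0 / d1 - 1.0 / d2)
    a = g1 - b / d1
    return a, b

def depth_from_gain(g, a, b):
    """Invert the calibrated gain model to estimate fixation depth."""
    return b / (g - a)

# Example: synthetic gains generated by a = 1.0, b = 0.1 (metres)
a, b = fit_two_point(0.5, 1.2, 2.0, 1.05)
print(depth_from_gain(1.1, a, b))  # recovers a depth of 1.0 m
```

In practice the gain g itself would be measured during a head movement as the ratio of compensatory eye rotation to head rotation (opposite in sign), e.g. from pupil-centre displacement and head-tracker angles; robustly extracting that ratio is one of the open challenges the abstract notes.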
Pages: 9
Cited References
33 total (entries [31]-[33] shown)
[31] Viirre, E.; Tweed, D.; Milner, K.; Vilis, T. A reexamination of the gain of the vestibuloocular reflex. Journal of Neurophysiology, 1986, 56(2): 439-450.
[32] Wang, Rui I.; Pelfrey, Brandon; Duchowski, Andrew T.; House, Donald H. Online 3D Gaze Localization on Stereoscopic Displays. ACM Transactions on Applied Perception, 2014, 11(1).
[33] Weier, Martin; Roth, Thorsten; Hinkenjann, Andre; Slusallek, Philipp. Predicting the Gaze Depth in Head-mounted Displays using Multiple Feature Regression. 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA 2018), 2018.