Deep learning models for webcam eye tracking in online experiments

Cited by: 7
Authors
Saxena, Shreshth [1 ,2 ]
Fink, Lauren K. [1 ,2 ,3 ]
Lange, Elke B. [1 ]
Affiliations
[1] Max Planck Institute for Empirical Aesthetics, Music Department, Frankfurt, Germany
[2] McMaster University, Department of Psychology, Neuroscience & Behaviour, Hamilton, ON, Canada
[3] Max Planck NYU Center for Language, Music, and Emotion, Frankfurt, Germany
Keywords
Online; Low resolution; Eye tracking; Deep learning; Computer vision; Eye gaze; Fixation; Free viewing; Smooth pursuit; Blinks
DOI
10.3758/s13428-023-02190-6
Chinese Library Classification (CLC): B841 [Psychological research methods]
Discipline classification code: 040201
Abstract
Eye tracking is prevalent in scientific and commercial applications. Recent computer vision and deep learning methods enable eye tracking with off-the-shelf webcams and reduce dependence on expensive, restrictive hardware. However, such deep learning methods have not yet been applied and evaluated for remote, online psychological experiments. In this study, we tackle critical challenges faced in remote eye tracking setups and systematically evaluate appearance-based deep learning methods of gaze tracking and blink detection. From their own homes and laptops, 65 participants performed a battery of eye tracking tasks including (i) fixation, (ii) zone classification, (iii) free viewing, (iv) smooth pursuit, and (v) blink detection. Webcam recordings of the participants performing these tasks were processed offline through appearance-based models of gaze and blink detection. The task battery required different eye movements, allowing gaze and blink prediction accuracy to be characterized over a comprehensive set of measures. We find a best gaze accuracy of 2.4° and a precision of 0.47°, which outperforms previous online eye tracking studies and reduces the gap between laboratory-based and online eye tracking performance. We release the experiment template, recorded data, and analysis code with the motivation to promote affordable, accessible, and scalable eye tracking that has the potential to accelerate research in the fields of psychological science, cognitive neuroscience, user experience design, and human-computer interfaces.
Pages: 3487-3503
Page count: 17
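
The abstract reports gaze accuracy and precision in degrees of visual angle. As an illustration only (this is not the authors' released analysis code), the sketch below shows one common way such measures can be computed from webcam gaze estimates: accuracy as the mean angular offset from a known fixation target, and precision as the root-mean-square of sample-to-sample angular dispersion. All function names, the pixel-to-centimeter scale, and the assumed viewing distance are hypothetical.

```python
import numpy as np

def angular_error_deg(points_px, reference_px, px_per_cm, viewing_distance_cm):
    """Convert on-screen offsets (pixels) to degrees of visual angle.

    points_px, reference_px: arrays of shape (N, 2) with gaze and reference
    coordinates in screen pixels. px_per_cm and viewing_distance_cm are
    assumed screen-scale and eye-to-screen-distance values.
    """
    offset_cm = np.linalg.norm(points_px - reference_px, axis=1) / px_per_cm
    return np.degrees(np.arctan2(offset_cm, viewing_distance_cm))

def accuracy_and_precision(pred_px, target_px, px_per_cm, viewing_distance_cm):
    """Accuracy: mean angular offset of predictions from the fixation target.
    Precision: RMS of sample-to-sample angular differences (dispersion of the
    estimates themselves, independent of the target)."""
    accuracy = angular_error_deg(pred_px, target_px,
                                 px_per_cm, viewing_distance_cm).mean()
    step_cm = np.linalg.norm(np.diff(pred_px, axis=0), axis=1) / px_per_cm
    step_deg = np.degrees(np.arctan2(step_cm, viewing_distance_cm))
    precision = np.sqrt(np.mean(step_deg ** 2))
    return accuracy, precision

# Usage with simulated fixation data (hypothetical screen and distance values).
rng = np.random.default_rng(0)
target = np.array([960.0, 540.0])                 # fixation target, pixels
pred = target + rng.normal(0.0, 15.0, (100, 2))   # noisy gaze estimates
acc, prec = accuracy_and_precision(pred, np.tile(target, (100, 1)),
                                   px_per_cm=38.0, viewing_distance_cm=60.0)
print(f"accuracy = {acc:.2f} deg, precision = {prec:.2f} deg")
```

The same pixel-to-degree conversion applies to the other tasks in the battery (e.g., distance of free-viewing gaze from regions of interest), with the viewing distance treated as an estimate in remote setups where it cannot be measured directly.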