Deep learning models for webcam eye tracking in online experiments

Cited by: 7
Authors
Saxena, Shreshth [1 ,2 ]
Fink, Lauren K. [1 ,2 ,3 ]
Lange, Elke B. [1 ]
Affiliations
[1] Max Planck Inst Empir Aesthet, Mus Depart, Frankfurt, Germany
[2] McMaster Univ, Dept Psychol Neurosci & Behav, Hamilton, ON, Canada
[3] Max Planck NYU Ctr Language Mus & Emot, Frankfurt, Germany
Keywords
Online; Low resolution; Eye tracking; Deep learning; Computer vision; Eye gaze; Fixation; Free viewing; Smooth pursuit; Blinks
DOI
10.3758/s13428-023-02190-6
Chinese Library Classification
B841 [Psychological research methods]
Discipline code
040201
Abstract
Eye tracking is prevalent in scientific and commercial applications. Recent computer vision and deep learning methods enable eye tracking with off-the-shelf webcams and reduce dependence on expensive, restrictive hardware. However, such deep learning methods have not yet been applied and evaluated for remote, online psychological experiments. In this study, we tackle critical challenges faced in remote eye tracking setups and systematically evaluate appearance-based deep learning methods of gaze tracking and blink detection. From their own homes and laptops, 65 participants performed a battery of eye tracking tasks including (i) fixation, (ii) zone classification, (iii) free viewing, (iv) smooth pursuit, and (v) blink detection. Webcam recordings of the participants performing these tasks were processed offline through appearance-based models of gaze and blink detection. The task battery required different eye movements that characterized gaze and blink prediction accuracy over a comprehensive list of measures. We find the best gaze accuracy to be 2.4° and precision of 0.47°, which outperforms previous online eye tracking studies and reduces the gap between laboratory-based and online eye tracking performance. We release the experiment template, recorded data, and analysis code with the motivation to escalate affordable, accessible, and scalable eye tracking that has the potential to accelerate research in the fields of psychological science, cognitive neuroscience, user experience design, and human-computer interfaces.
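The accuracy and precision figures reported in the abstract follow conventions common in eye tracking evaluation: accuracy as the mean angular offset between gaze estimates and the known target, and precision as the RMS of sample-to-sample angular deviations. A minimal sketch of these standard measures (an illustration of the common definitions, not the authors' released analysis code):

```python
import numpy as np

def accuracy_deg(gaze_deg: np.ndarray, target_deg: np.ndarray) -> float:
    """Mean Euclidean angular offset (in degrees of visual angle) between
    gaze samples (N x 2 array of x/y positions) and the target position."""
    offsets = np.linalg.norm(gaze_deg - target_deg, axis=1)
    return float(offsets.mean())

def precision_rms_deg(gaze_deg: np.ndarray) -> float:
    """RMS of successive sample-to-sample angular distances, a common
    definition of spatial precision during a fixation."""
    diffs = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Toy usage: gaze samples jittering around a target at the origin.
gaze = np.array([[1.0, 0.0], [0.0, 1.0]])
target = np.array([0.0, 0.0])
print(accuracy_deg(gaze, target))   # mean offset of the two samples
print(precision_rms_deg(gaze))      # RMS of the single inter-sample step
```

Both functions assume gaze coordinates have already been converted from pixels to degrees of visual angle, which requires the viewing distance and screen geometry; in webcam setups these are typically estimated rather than measured.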
Pages: 3487-3503
Page count: 17