Algorithms for the automated correction of vertical drift in eye-tracking data

Cited by: 0
Authors
Jon W. Carr
Valentina N. Pescuma
Michele Furlan
Maria Ktori
Davide Crepaldi
Institution
[1] International School for Advanced Studies (SISSA)
Source
Behavior Research Methods | 2022, Vol. 54
Keywords
Algorithms; Dynamic time warping; Eye tracking; Line assignment; Reading; Vertical drift
DOI
Not available
Abstract
A common problem in eye-tracking research is vertical drift—the progressive displacement of fixation registrations on the vertical axis that results from a gradual loss of eye-tracker calibration over time. This is particularly problematic in experiments that involve the reading of multiline passages, where it is critical that fixations on one line are not erroneously recorded on an adjacent line. Correction is often performed manually by the researcher, but this process is tedious, time-consuming, and prone to error and inconsistency. Various methods have previously been proposed for the automated, post hoc correction of vertical drift in reading data, but these methods vary greatly, not just in terms of the algorithmic principles on which they are based, but also in terms of their availability, documentation, implementation languages, and so forth. Furthermore, these methods have largely been developed in isolation with little attempt to systematically evaluate them, meaning that drift correction techniques are moving forward blindly. We document ten major algorithms, including two that are novel to this paper, and evaluate them using both simulated and natural eye-tracking data. Our results suggest that a method based on dynamic time warping offers great promise, but we also find that some algorithms are better suited than others to particular types of drift phenomena and reading behavior, allowing us to offer evidence-based advice on algorithm selection.
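To illustrate the general principle behind the dynamic-time-warping approach mentioned in the abstract, the sketch below matches a sequence of fixation y-coordinates to the known y-positions of the text lines with a standard DTW recurrence. This is only an illustrative toy, not the authors' published algorithm: the function name `dtw_assign_lines` and the use of line y-positions alone (rather than, e.g., word positions) are assumptions for the example.

```python
from math import inf

def dtw_assign_lines(fix_y, line_y):
    """Assign fixations (in reading order) to text lines via dynamic
    time warping, so gradual vertical drift does not push a fixation
    onto an adjacent line. Returns one line index per fixation."""
    n, m = len(fix_y), len(line_y)
    # cost[i][j]: minimal total distance matching the first i fixations
    # to the first j lines along a monotonic warping path.
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(fix_y[i - 1] - line_y[j - 1])
            cost[i][j] = d + min(
                cost[i - 1][j],      # another fixation on the same line
                cost[i - 1][j - 1],  # reader moves on to the next line
                cost[i][j - 1],      # a line is passed over
            )
    # Backtrack along the optimal warping path to recover the assignment.
    assignment = [0] * n
    i, j = n, m
    while i > 0:
        assignment[i - 1] = j - 1
        _, i, j = min(
            (cost[i - 1][j], i - 1, j),
            (cost[i - 1][j - 1], i - 1, j - 1),
            (cost[i][j - 1], i, j - 1),
        )
    return assignment

# Fixations drift downward over time, yet DTW recovers the intended lines:
print(dtw_assign_lines([100, 103, 148, 152, 205, 212], [100, 150, 200]))
```

Note one design consequence of plain DTW: the warping path must start on the first line and end on the last, so this sketch implicitly assumes the passage is read in order from top to bottom, which is one reason different algorithms suit different reading behaviors.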
Pages: 287–310
Number of pages: 23