Fixation precision in high-speed noncontact eye-gaze tracking

Cited by: 28
Authors
Hennessey, Craig [1]
Noureddin, Borna [1]
Lawrence, Peter [1]
Institutions
[1] Univ British Columbia, Vancouver, BC V6T 1Z4, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
eye-gaze tracking; fixation precision; head free; high speed; human-computer interface; noncontact; remote;
DOI
10.1109/TSMCB.2007.911378
Chinese Library Classification
TP [Automation and Computer Technology];
Subject Classification Code
0812;
Abstract
The precision of point-of-gaze (POG) estimation during a fixation is an important factor in determining the usability of a noncontact eye-gaze tracking system for real-time applications. The objective of this paper is to define and measure POG fixation precision, propose methods for increasing the fixation precision, and examine the improvements when the methods are applied to two POG estimation approaches. To achieve these objectives, techniques for high-speed image processing that allow POG sampling rates of over 400 Hz are presented. At these high POG sampling rates, the fixation precision can be improved by filtering while maintaining an acceptable real-time latency. The high-speed sampling and digital filtering techniques developed were applied to two POG estimation techniques: the high-speed pupil-corneal reflection (HS P-CR) vector method and a 3-D model-based method allowing free head motion. Evaluation on subjects showed that, when operating at 407 frames per second (fps) with filtering, the fixation precision for the HS P-CR POG estimation method was improved by a factor of 5.8, to 0.035 degrees (1.6 screen pixels), compared to unfiltered operation at 30 fps. For the 3-D POG estimation method, the fixation precision was improved by a factor of 11, to 0.050 degrees (2.3 screen pixels), compared to unfiltered operation at 30 fps.
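The key idea in the abstract is that a high sampling rate makes filtering practical: with roughly independent per-sample noise, averaging a window of N samples shrinks the spread by about sqrt(N), and at 407 Hz even a 32-sample window adds only about 32/407 ≈ 79 ms of latency. The Python sketch below illustrates this with a causal moving-average filter over simulated POG samples. It is a minimal illustration, not the paper's implementation: the filter type, the 32-sample window, the noise model, and the use of RMS deviation from the mean as the precision metric are all assumptions made here.

import numpy as np

def moving_average(pog, window):
    # Causal moving-average smoother over a stream of (x, y) POG samples.
    # Hypothetical stand-in for the paper's digital filtering stage; at a
    # 407 Hz sampling rate the added latency is roughly window / 407 s.
    kernel = np.ones(window) / window
    x = np.convolve(pog[:, 0], kernel, mode="valid")  # drops warm-up samples
    y = np.convolve(pog[:, 1], kernel, mode="valid")
    return np.column_stack([x, y])

def fixation_spread(pog):
    # RMS deviation from the mean POG during a fixation (degrees); one
    # common way to quantify fixation precision, assumed here.
    centered = pog - pog.mean(axis=0)
    return float(np.sqrt(np.mean(np.sum(centered ** 2, axis=1))))

# One second of simulated fixation data at 407 fps: a fixed gaze point
# plus Gaussian estimation noise (0.2 degrees RMS per axis, assumed).
rng = np.random.default_rng(0)
raw = np.array([10.0, 5.0]) + rng.normal(scale=0.2, size=(407, 2))

print(f"unfiltered spread: {fixation_spread(raw):.3f} deg")
print(f"filtered spread:   {fixation_spread(moving_average(raw, 32)):.3f} deg")

Under these assumptions, doubling the sampling rate lets a system either halve the filter latency at the same precision or improve precision at the same latency, which matches the abstract's framing of precision gains "while maintaining an acceptable real-time latency."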
Pages: 289-298
Page count: 10
Related Papers (entries 41-50 of 50 shown)
  • [41] An affective user interface based on facial expression recognition and eye-gaze tracking
    Choi, SM
    Kim, YG
    AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION, PROCEEDINGS, 2005, 3784 : 907 - 914
  • [42] High-speed nonlinear discrete tracking-differentiator with high precision
    Xie, Yun-De
    Long, Zhi-Qiang
    Kongzhi Lilun Yu Yingyong/Control Theory and Applications, 2009, 26 (02): : 127 - 132
  • [43] Binocular Vision Impairments Therapy Supported by Contactless Eye-Gaze Tracking System
    Kosikowski, Lukasz
    Czyzewski, Andrzej
    COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, PROCEEDINGS, PT 2, 2010, 6180 : 373 - 376
  • [44] High-speed Gaze-oriented Projection by Cross-ratio-based Eye Tracking with Dual Infrared Imaging
    Matsumoto, Ayumi
    Sueishi, Tomohiro
    Ishikawa, Masatoshi
    2022 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS (VRW 2022), 2022, : 585 - 586
  • [45] Cross-talk elimination for lenslet array near eye display based on eye-gaze tracking
    Ye, Bi
    Fujimoto, Yuichiro
    Uchimine, Yuta
    Sawabe, Taishi
    Kanbara, Masayuki
    Kato, Hirokazu
    OPTICS EXPRESS, 2022, 30 (10) : 16196 - 16216
  • [46] Eye-Gaze Tracking Based on Head Orientation Estimation Using FMCW Radar Sensor
    Jung, Jaehoon
    Kim, Jihye
    Kim, Seong-Cheol
    Lim, Sohee
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73
  • [47] How impression formation influences eye gaze fixation? An eye tracking investigation
    Tuladhar, Viplav
    Maheshwari, Saurabh
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2023, 58 : 961 - 962
  • [48] Improving the Accuracy and Reliability of Remote System-Calibration-Free Eye-Gaze Tracking
    Hennessey, Craig A.
    Lawrence, Peter D.
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2009, 56 (07) : 1891 - 1900
  • [49] Real-time motorized electrical hospital bed control with eye-gaze tracking
    Aydin Atasoy, Nesrin
    Cavusoglu, Abdullah
    Atasoy, Ferhat
TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2016, 24 (06) : 5162+
  • [50] Eye-gaze Tracking Method Driven by Raspberry PI Applicable in Automotive Traffic Safety
    Stan, Ovidiu
    Miclea, Liviu
    Centea, Ana
    2014 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, MODELLING AND SIMULATION, 2014, : 126 - 130