An Eye-Tracker-Based 3D Point-of-Gaze Estimation Method Using Head Movement

Cited by: 7
Authors
Pichitwong, Wudthipong [1 ]
Chamnongthai, Kosin [1 ]
Affiliations
[1] King Mongkuts Univ Technol Thonburi, Dept Elect & Telecommun Engn, Bangkok 10140, Thailand
Source
IEEE ACCESS | 2019, Vol. 7
Keywords
3D gaze estimation; gaze tracking; eye tracker;
DOI
10.1109/ACCESS.2019.2929195
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Eye trackers are currently used to sense both the centers of the pupils and the point-of-gaze (POG) position on a screen, in keeping with the original objective for which they were designed; however, it remains difficult to measure the positions of three-dimensional (3D) POGs. This paper proposes a method for 3D gaze estimation that uses head movement, pupil position data, and POGs on a screen. The method assumes that a person, usually unintentionally, moves his or her head a short distance, so that multiple straight lines can be drawn from the center point between the two pupils to the POG. When the person continuously focuses on a given 3D POG while moving, these lines represent lines of sight that intersect at that 3D POG. The 3D POG can therefore be found from the intersection of the several lines of sight formed by head movements. To evaluate the performance of the proposed method, experimental equipment was constructed, and experiments with five male and five female participants were performed in which the participants looked at nine test points in a 3D space for approximately 20 s each. The experimental results reveal that the proposed method can measure 3D POGs with average distance errors of 13.36 cm, 7.58 cm, 5.72 cm, 3.97 cm, and 3.52 cm for head movement distances of 1 cm, 2 cm, 3 cm, 4 cm, and 5 cm, respectively.
Pages: 99086-99098
Page count: 13
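
The abstract describes recovering a 3D POG as the point where several lines of sight, collected while the head moves, intersect. As a rough illustration of that geometric step only (this is not the authors' implementation; the function name, the mid-pupil positions, the gaze directions, and the noise level in the toy example are all hypothetical), the NumPy sketch below estimates the point closest, in the least-squares sense, to a set of 3D lines:

```python
import numpy as np

def closest_point_to_lines(points, directions):
    """Least-squares 3D point closest to a set of lines.

    Each line i is given by a point p_i and a direction d_i; the returned
    point x minimizes the sum of squared perpendicular distances to all
    lines (a standard estimate of the intersection of noisy lines).
    """
    points = np.asarray(points, dtype=float)
    dirs = np.asarray(directions, dtype=float)
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)  # unit directions

    # For each line, the projector onto its normal space is I - d d^T.
    eye = np.eye(3)
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        proj = eye - np.outer(d, d)
        A += proj
        b += proj @ p
    # A becomes singular if all lines are parallel; lstsq still returns
    # the minimum-norm least-squares solution in that case.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Toy usage: three noisy lines of sight aimed at a hypothetical 3D POG.
if __name__ == "__main__":
    target = np.array([0.10, 0.20, 0.60])            # assumed 3D POG (metres)
    eye_centers = np.array([[0.00, 0.00, 0.00],      # mid-pupil positions as the
                            [0.01, 0.00, 0.00],      # head shifts a few centimetres
                            [0.02, 0.01, 0.00]])
    gaze_dirs = (target - eye_centers) + 0.002 * np.random.randn(3, 3)
    print(closest_point_to_lines(eye_centers, gaze_dirs))
```

In the paper itself the lines of sight are derived from the eye tracker's pupil centers and on-screen POGs; the toy data above only stands in for that input to show how the intersection estimate behaves.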