Revolutionizing Gaze-Based Human-Computer Interaction Using Iris Tracking: A Webcam-Based Low-Cost Approach With Calibration, Regression and Real-Time Re-Calibration

Cited by: 1
|
Authors
Chhimpa, Govind Ram [1 ]
Kumar, Ajay [2 ]
Garhwal, Sunita [2 ]
Kumar, Dhiraj
Khan, Faheem [3 ]
Moon, Yeon-Kug [4 ]
Affiliations
[1] Manipal Univ, Dept Internet Things & Intelligent Syst, Jaipur 303007, Rajasthan, India
[2] Thapar Inst Engn & Technol, Comp Sci & Engn, Patiala 147001, Punjab, India
[3] Gachon Univ, Dept Comp Engn, Seongnam Si 13120, South Korea
[4] Sejong Univ, Dept Artificial Intelligence & Data Sci, Seoul 05006, South Korea
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Human-computer interaction; eye-gaze tracking; iris-tracking; calibration; regression; low-cost; real-time re-calibration; EYE-TRACKING;
DOI
10.1109/ACCESS.2024.3498441
CLC Classification Code
TP [Automation technology, computer technology]
Discipline Classification Code
0812
Abstract
Eye movements are essential in human-computer interaction (HCI) because they offer insights into individuals' cognitive states and visual attention. Techniques for accurately assessing gaze have multiplied over the last two decades. Notably, video-based tracking methods have gained considerable interest within the research community due to their nonintrusive nature, enabling precise and convenient gaze estimation without physical contact or invasive measures. This paper introduces a video-based gaze-tracking method that provides an affordable, user-friendly, and dependable HCI system based on iris movement. Facial features are extracted from real-time video sequences using the MediaPipe face mesh model. A 5-point user-specific calibration and multiple regression techniques are employed to accurately predict the gaze point on the screen. The proposed system handles changes in body position and user posture through real-time re-calibration using z-index tracking, and it compensates for minor head movements that would otherwise introduce inaccuracies. The system is cost-effective, with a total cost below $25, which may vary depending on the camera used. Thirteen participants were involved in the system testing. The system demonstrates a high level of sensitivity to low-light conditions, a strong response to changes in distance, and a moderate reaction to glasses, with an average frame-processing time of 0.047 seconds. On average, it achieves a visual-angle accuracy of 1.12 degrees with head movement and 1.3 degrees without head movement.
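As a reading aid, the following is a minimal Python sketch, not the authors' released implementation, of the pipeline the abstract describes: iris-centre extraction with the MediaPipe face mesh (refined iris landmarks) and a regression mapping from iris position to screen coordinates fitted on the 5-point calibration samples. The per-axis polynomial regression, the averaging of both irises, and the helper names (iris_centre, fit_gaze_map, predict_gaze) are illustrative assumptions; the z-index-based re-calibration and head-movement compensation are omitted.

# Minimal sketch, assuming MediaPipe's refined iris landmarks (468-477) and a
# simple per-axis polynomial regression; names and regression form are illustrative.
import cv2
import numpy as np
import mediapipe as mp

LEFT_IRIS = [468, 469, 470, 471, 472]    # available when refine_landmarks=True
RIGHT_IRIS = [473, 474, 475, 476, 477]

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)

def iris_centre(frame_bgr):
    """Mean normalized (x, y) of both iris centres, or None if no face is found."""
    result = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        return None
    lm = result.multi_face_landmarks[0].landmark
    pts = np.array([[lm[i].x, lm[i].y] for i in LEFT_IRIS + RIGHT_IRIS])
    return pts.mean(axis=0)

def fit_gaze_map(calib_iris, calib_screen, degree=2):
    """Fit one polynomial per screen axis from the calibration samples.
    calib_iris: (N, 2) normalized iris centres; calib_screen: (N, 2) pixel targets."""
    cx = np.polyfit(calib_iris[:, 0], calib_screen[:, 0], degree)
    cy = np.polyfit(calib_iris[:, 1], calib_screen[:, 1], degree)
    return cx, cy

def predict_gaze(iris_xy, cx, cy):
    """Map a normalized iris centre to an estimated on-screen point in pixels."""
    return float(np.polyval(cx, iris_xy[0])), float(np.polyval(cy, iris_xy[1]))

In use, each on-screen calibration target would contribute one averaged iris sample to calib_iris, and re-fitting these coefficients whenever the face's depth changes would roughly correspond to the real-time re-calibration step the abstract mentions.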
Pages: 168256-168269
Number of pages: 14
Related Papers
5 records in total
  • [1] Real-Time Gaze Estimation Using Webcam-Based CNN Models for Human-Computer Interactions
    Vidhya, Visal
    Resende Faria, Diego
    COMPUTERS, 2025, 14 (02)
  • [2] CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction
    Santini, Thiago
    Fuhl, Wolfgang
    Kasneci, Enkelejda
    PROCEEDINGS OF THE 2017 ACM SIGCHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI'17), 2017, : 2594 - 2605
  • [3] Real-time human-computer interface based on eye gaze estimation from low-quality webcam images: integration of convolutional neural networks, calibration, and transfer learning
    Chhimpa, Govind R.
    Kumar, Ajay
    Garhwal, Sunita
    Kumar, Dhiraj
    DIGITAL SCHOLARSHIP IN THE HUMANITIES, 2025, : 64 - 74
  • [4] Sens-BERT: A BERT-Based Approach for Enabling Transferability and Re-Calibration of Calibration Models for Low-Cost Sensors Under Reference Measurements Scarcity
    Narayana, M. V.
    Rachavarapu, Kranthi Kumar
    Jalihal, Devendra
    Nagendra, S. M. Shiva
    IEEE SENSORS JOURNAL, 2024, 24 (07) : 11362 - 11373
  • [5] Design and Implementation of Human-Computer Interaction System Based on Real-Time Tracking of the Shoot Point from the Light Pen
    Ye, Lexiao
    Wang, Yigang
    PROCEEDINGS OF 2009 INTERNATIONAL CONFERENCE ON IMAGE ANALYSIS AND SIGNAL PROCESSING, 2009, : 245 - +