A real-time camera-based gaze-tracking system involving dual interactive modes and its application in gaming

Cited by: 1
Authors
Zhang, He [1 ]
Yin, Lu [1 ]
Zhang, Hanling [1 ]
Affiliations
[1] Hunan Univ, Sch Design, Changsha 410082, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Camera-based eye gaze interaction; Head gaze interaction; Human-computer interaction; Computer vision; User experience;
DOI
10.1007/s00530-023-01204-9
CLC number
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
Eye-tracking and head-tracking techniques have been applied in many fields, including human-computer interaction, gaming, virtual reality (VR), and medicine. In these applications, users must wear special hardware such as eye trackers or head-mounted devices. However, these devices are expensive, can be complicated to operate, and may be uncomfortable to wear. How, then, can we track eye and head movements in real time without them? In this paper, we present a real-time camera-based gaze-tracking system that provides two interactive modes: eye gaze and head gaze. The system uses the same calibration procedure to project the gaze direction of the eyes or head onto screen coordinates. We then designed a 9-point circular interface to examine accuracy. Eye gaze and head gaze achieved visual angle errors of 1.76 and 2.65 degrees, respectively, comparable to commercial eye trackers. We also applied the system to a game and verified its effectiveness for interaction by analyzing user experience and game scores under the different interactive modes. The experimental results showed that users could achieve scores similar to keyboard input using eye gaze, and felt more immersed when using head gaze. Our findings can help provide more enjoyable options for users to interact with computers.
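The abstract describes two measurable steps: a calibration that maps gaze direction to screen coordinates using nine on-screen targets, and an accuracy evaluation reported as a visual angle error. The paper does not give its exact mapping model, so the sketch below is a common illustrative approach, not the authors' implementation: a second-order polynomial fitted by least squares to the nine calibration samples, plus the standard conversion of an on-screen error distance into a visual angle. The feature choice, model order, and all parameter names are assumptions.

```python
import numpy as np

def design_matrix(g):
    """Second-order polynomial basis over normalized gaze features (gx, gy).
    A common calibration model; the paper's actual mapping is unspecified."""
    gx, gy = g[:, 0], g[:, 1]
    return np.column_stack([np.ones_like(gx), gx, gy, gx * gy, gx**2, gy**2])

def calibrate(gaze_samples, screen_points):
    """Fit per-axis coefficients from the 9 calibration targets (least squares).
    gaze_samples: (9, 2) gaze features; screen_points: (9, 2) pixel targets."""
    A = design_matrix(gaze_samples)
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def project(gaze, coeffs):
    """Map new gaze features to screen coordinates with the fitted model."""
    return design_matrix(np.atleast_2d(gaze)) @ coeffs

def visual_angle_error(err_px, px_per_cm, viewing_dist_cm):
    """Convert an on-screen error (pixels) to a visual angle in degrees,
    given screen resolution density and the user's viewing distance."""
    return np.degrees(np.arctan((err_px / px_per_cm) / viewing_dist_cm))
```

For example, with a fitted model one would average `visual_angle_error` over the nine test points to obtain a figure comparable to the paper's reported 1.76 degrees (eye gaze) or 2.65 degrees (head gaze).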
Pages: 14