3D Gaze Estimation for Head-Mounted Eye Tracking System With Auto-Calibration Method

Cited: 27
Authors
Liu, Meng [1 ]
Li, Youfu [1 ]
Liu, Hai [1 ,2 ]
Affiliations
[1] City Univ Hong Kong, Dept Mech Engn, Hong Kong, Peoples R China
[2] Cent China Normal Univ, Natl Engn Res Ctr E Learning, Wuhan 430079, Peoples R China
Source
IEEE ACCESS | 2020, Vol. 8
Funding
National Natural Science Foundation of China;
Keywords
Head-mounted gaze tracking system; saliency maps; auto-calibration; 3D gaze estimation; CALIBRATION;
DOI
10.1109/ACCESS.2020.2999633
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
The general challenges of 3D gaze estimation for head-mounted eye tracking systems are an inflexible marker-based calibration procedure and significant depth-estimation error. In this paper, we propose a 3D gaze estimation approach with an auto-calibration method. To acquire an accurate 3D structure of the environment, an RGBD camera is used as the scene camera of our system. By adopting a saliency detection method, saliency maps are computed from scene images, and 3D salient pixels in the scene are treated as potential 3D calibration targets. A 3D eye model is built from eye images to determine gaze vectors. By combining the 3D salient pixels with the gaze vectors, auto-calibration is achieved with our calibration method. Finally, the 3D gaze point is obtained from the calibrated gaze vectors and the point cloud generated by the RGBD camera. Experimental results show that the proposed system achieves an average accuracy of 3.7 degrees over a range of 1 m to 4 m indoors and 4.0 degrees outdoors. The proposed system also shows a marked improvement in depth measurement, which is sufficient for tracking users' visual attention in real scenes.
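The record's abstract only sketches the auto-calibration step. As an illustrative sketch (not the authors' actual method), the core alignment between uncalibrated eye-frame gaze vectors and scene-frame directions toward matched 3D salient points can be posed as a rotation fit via the Kabsch algorithm. The function names and the rotation-only eye-to-scene model below are assumptions for illustration; the paper's calibration may estimate additional parameters.

```python
import numpy as np

def normalize(v):
    """Normalize rows of v to unit length."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def estimate_calibration_rotation(gaze_dirs, target_dirs):
    """Fit a rotation R with R @ gaze_i ~= target_i (Kabsch algorithm).

    gaze_dirs:   (N, 3) unit gaze vectors from the 3D eye model (eye frame).
    target_dirs: (N, 3) unit directions toward matched 3D salient points
                 (scene-camera frame).
    """
    H = gaze_dirs.T @ target_dirs             # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

Given gaze vectors paired over time with directions to salient points, the estimated rotation maps eye-frame gaze into the scene frame; casting the rotated ray into the RGBD point cloud would then yield the 3D gaze point.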
Pages: 104207-104215
Page count: 9
Related Papers
35 records in total
[1]   Auto-Calibrated Gaze Estimation Using Human Gaze Patterns [J].
Alnajar, Fares ;
Gevers, Theo ;
Valenti, Roberto ;
Ghebreab, Sennay .
INTERNATIONAL JOURNAL OF COMPUTER VISION, 2017, 124 (02) :223-236
[2]   A Regression-Based User Calibration Framework for Real-Time Gaze Estimation [J].
Arar, Nuri Murat ;
Gao, Hua ;
Thiran, Jean-Philippe .
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2017, 27 (12) :2623-2638
[3]  
Bace M., 2018, PROC ACM S EYE TRACK, P1
[4]   Saliency Detection for Stereoscopic Images Based on Depth Confidence Analysis and Multiple Cues Fusion [J].
Cong, Runmin ;
Lei, Jianjun ;
Zhang, Changqing ;
Huang, Qingming ;
Cao, Xiaochun ;
Hou, Chunping .
IEEE SIGNAL PROCESSING LETTERS, 2016, 23 (06) :819-823
[5]   Going From RGB to RGBD Saliency: A Depth-Guided Transformation Model [J].
Cong, Runmin ;
Lei, Jianjun ;
Fu, Huazhu ;
Hou, Junhui ;
Huang, Qingming ;
Kwong, Sam .
IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (08) :3627-3639
[6]   Review of Visual Saliency Detection With Comprehensive Information [J].
Cong, Runmin ;
Lei, Jianjun ;
Fu, Huazhu ;
Cheng, Ming-Ming ;
Lin, Weisi ;
Huang, Qingming .
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2019, 29 (10) :2941-2959
[7]   A Novel Eye-Gaze-Controlled Wheelchair System for Navigating Unknown Environments: Case Study With a Person With ALS [J].
Eid, Mohamad A. ;
Giakoumidis, Nikolas ;
El Saddik, Abdulmotaleb .
IEEE ACCESS, 2016, 4 :558-573
[8]   3D gaze estimation in the scene volume with a head-mounted eye tracker [J].
Elmadjian, Carlos ;
Shukla, Pushkar ;
Tula, Antonio Diaz ;
Morimoto, Carlos H. .
COMMUNICATION BY GAZE INTERACTION (COGAIN 2018), 2018,
[9]  
Ester M., 1996, KDD-96 Proceedings. Second International Conference on Knowledge Discovery and Data Mining, P226
[10]  
Harel Jonathan, 2007, ADV NEURAL INFORM PR, P545, DOI 10.7551/MITPRESS/7503.001.0001