GAM360: sensing gaze activities of multi-persons in 360 degrees

Times Cited: 0
Authors
Cai, Zhuojiang [1 ]
Wang, Haofei [2 ]
Niu, Yuhao [1 ]
Lu, Feng [1 ,2 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Engn, State Key Lab Virtual Real Technol & Syst, Beijing 100191, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
Gaze estimation; Eye tracking; Fisheye camera; 360-degree sensing;
DOI
10.1007/s42486-024-00168-7
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Eye gaze reveals one's attentional focus in social interaction. However, most existing gaze estimation systems can only sense gaze activities within a specific angular range for a particular individual, thus limiting their applicability in multi-person face-to-face scenarios. In this paper, we propose GAM360, a flexible system that enables 360-degree multi-person 3D gaze and screen point-of-gaze (PoG) estimation. Different from previous techniques, we build the system based on a bottom-up fisheye camera, which can sense gaze activities from users in different directions simultaneously. We develop a real-time multi-person 3D gaze estimation algorithm, along with a method for multi-person screen PoG estimation. Experimental results demonstrate that the system can accurately track the gaze of multiple users in 360 degrees, achieving a 3D gaze estimation angular error of 8.7° ± 4.9° under free head movement. Furthermore, we validate the effectiveness of the system across various application scenarios, including group discussion, desktop environment, and classroom environment, providing new insights into ubiquitous gaze sensing.
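The screen PoG estimation described in the abstract reduces, geometrically, to intersecting an estimated 3D gaze ray with the known screen plane. The sketch below illustrates that geometry only; the function name, coordinate frames, and calibration inputs are assumptions for illustration and do not reflect the paper's actual implementation.

```python
import numpy as np

def gaze_to_screen_pog(eye_pos, gaze_dir, screen_origin, screen_x, screen_y):
    """Intersect a 3D gaze ray with a planar screen (hypothetical helper).

    eye_pos:       3D eye position in a shared world/camera frame
    gaze_dir:      unit 3D gaze direction from the eye
    screen_origin: 3D position of the screen's top-left corner
    screen_x/y:    unit vectors spanning the screen's width and height axes
    Returns 2D screen coordinates (in metres), or None if there is no hit.
    """
    normal = np.cross(screen_x, screen_y)          # screen plane normal
    denom = np.dot(gaze_dir, normal)
    if abs(denom) < 1e-6:                          # gaze ray parallel to the screen
        return None
    t = np.dot(screen_origin - eye_pos, normal) / denom
    if t < 0:                                      # screen lies behind the viewer
        return None
    hit = eye_pos + t * gaze_dir                   # 3D intersection point
    rel = hit - screen_origin
    return np.dot(rel, screen_x), np.dot(rel, screen_y)
```

Running this per detected person, with each person's estimated eye position and gaze direction, would yield one PoG per user; the screen pose inputs would come from an external calibration step.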
Pages: 174-187
Page count: 14