Towards gaze-based prediction of the intent to interact in virtual reality

Cited by: 44
Authors
David-John, Brendan [1 ]
Peacock, Candace E. [1 ]
Zhang, Ting [1 ]
Murdison, T. Scott [2 ]
Benko, Hrvoje [1 ]
Jonker, Tanya R. [1 ]
Affiliations
[1] Facebook Reality Labs Research, Redmond, WA 98052 USA
[2] Facebook Reality Labs, Redmond, WA USA
Source
ACM SYMPOSIUM ON EYE TRACKING RESEARCH AND APPLICATIONS, ETRA 2021 | 2021
Keywords
intent prediction; eye tracking; mixed reality; virtual reality; interaction; eye movements; user interfaces
DOI
10.1145/3448018.3458008
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
With the increasing frequency of eye tracking in consumer products, including head-mounted augmented and virtual reality displays, gaze-based models have the potential to predict user intent and unlock intuitive new interaction schemes. In the present work, we explored whether gaze dynamics can predict when a user intends to interact with the real or digital world, which could be used to develop predictive interfaces for low-effort input. Eye-tracking data were collected from 15 participants performing an item-selection task in virtual reality. Using logistic regression, we demonstrated successful prediction of the onset of item selection. The most prevalent predictive features in the model were gaze velocity, ambient/focal attention, and saccade dynamics, demonstrating that gaze features typically used to characterize visual attention can be applied to model interaction intent. In the future, these types of models can be used to infer users' near-term interaction goals and drive ultra-low-friction predictive interfaces.
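The abstract describes a logistic-regression model that predicts the onset of item selection from gaze features such as gaze velocity, ambient/focal attention, and saccade dynamics. The sketch below illustrates that general recipe with scikit-learn; the specific feature set, windowing, synthetic data, and class weighting are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch (not the authors' code): predict interaction intent from
# per-window gaze features with logistic regression. All feature names,
# the synthetic data, and the label rule are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_windows = 2000

# Hypothetical per-window features: mean gaze velocity (deg/s),
# ambient/focal K coefficient, saccade amplitude (deg), fixation duration (s).
X = np.column_stack([
    rng.gamma(2.0, 20.0, n_windows),   # gaze velocity
    rng.normal(0.0, 1.0, n_windows),   # ambient/focal K coefficient
    rng.gamma(2.0, 2.0, n_windows),    # saccade amplitude
    rng.gamma(2.0, 0.15, n_windows),   # fixation duration
])

# Synthetic labels: selection onsets are made more likely in focal,
# low-velocity windows (an illustrative relationship only).
logit = 0.02 * (40.0 - X[:, 0]) + 0.8 * X[:, 1] - 1.5
y = (rng.random(n_windows) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Standardize features and fit a class-weighted logistic regression,
# since intent windows are rarer than non-intent windows.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(class_weight="balanced", max_iter=1000),
)
model.fit(X_train, y_train)

# Report AUC, a threshold-free metric suited to imbalanced intent prediction.
scores = model.predict_proba(X_test)[:, 1]
print(f"AUC: {roc_auc_score(y_test, scores):.3f}")
```

In a real pipeline, the rows of X would come from sliding windows over the eye-tracking stream and the positive class would mark windows immediately preceding a selection event; the class weighting and AUC evaluation shown here are common choices for that kind of imbalanced onset-prediction task.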
Pages: 7