Gaze Augmentation in Egocentric Video Improves Awareness of Intention

Cited by: 11
Authors
Akkil, Deepak [1 ]
Isokoski, Poika [1 ]
Affiliations
[1] Univ Tampere, Tampere Unit Comp Human Interact, FIN-33101 Tampere, Finland
Source
34TH ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2016 | 2016
Keywords
Video-based collaboration; Gaze tracking; Wearable computing; Intention prediction; Eye movements; Memory
DOI
10.1145/2858036.2858127
CLC Classification Number
TP18 [Artificial intelligence theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Video communication using head-mounted cameras could be useful for mediating shared activities and supporting collaboration. The growing popularity of wearable gaze trackers presents an opportunity to add gaze information to the egocentric video. We hypothesized three potential benefits of gaze-augmented egocentric video in collaborative scenarios: supporting deictic referencing, enabling grounding in communication, and enabling better awareness of the collaborator's intentions. Previous research on using egocentric videos for real-world collaborative tasks has failed to show clear benefits of gaze-point visualization. We designed a study, deconstructing a collaborative car-navigation scenario, to specifically target the value of gaze-augmented video for intention prediction. Our results show that viewers of gaze-augmented video could predict the direction taken by a driver at a four-way intersection more accurately and more confidently than viewers of the same video without the superimposed gaze point. Our study demonstrates that gaze augmentation can be useful and encourages further study in real-world collaborative scenarios.
Pages: 1573-1584
Page count: 12