Accelerating eye movement research via accurate and affordable smartphone eye tracking

Cited by: 116
Authors
Valliappan, Nachiappan [1 ]
Dai, Na [1 ]
Steinberg, Ethan [1 ,2 ]
He, Junfeng [1 ]
Rogers, Kantwon [1 ,3 ]
Ramachandran, Venky [1 ]
Xu, Pingmei [1 ]
Shojaeizadeh, Mina [1 ]
Guo, Li [1 ,4 ]
Kohlhoff, Kai [1 ]
Navalpakkam, Vidhya [1 ]
Affiliations
[1] Google Res, Mountain View, CA 94043 USA
[2] Stanford Univ, Stanford, CA 94305 USA
[3] Georgia Inst Technol, Atlanta, GA 30332 USA
[4] Johns Hopkins Univ, Baltimore, MD USA
Keywords
visual search; attention; stimulus
DOI
10.1038/s41467-020-18360-5
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline classification codes
07; 0710; 09
Abstract
Eye tracking has been widely used for decades in vision research, language, and usability. However, most prior research has focused on large desktop displays using specialized eye trackers that are expensive and cannot scale. Little is known about eye movement behavior on phones, despite their pervasiveness and the large amount of time spent on them. We leverage machine learning to demonstrate accurate smartphone-based eye tracking without any additional hardware. We show that the accuracy of our method is comparable to state-of-the-art mobile eye trackers that are 100x more expensive. Using data from over 100 opted-in users, we replicate key findings from previous eye movement research on oculomotor tasks and saliency analyses during natural image viewing. In addition, we demonstrate the utility of smartphone-based gaze for detecting reading comprehension difficulty. Our results show the potential for scaling eye movement research by orders of magnitude to thousands of participants (with explicit consent), enabling advances in vision research, accessibility and healthcare.
Pages: 12
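The abstract states that gaze is estimated with machine learning from the phone's front-facing camera alone, without extra hardware. As a rough illustration of what such a camera-to-screen gaze regressor can look like (not the architecture reported in the paper), the sketch below assumes a small Keras model that maps left- and right-eye crops to normalized on-screen (x, y) coordinates; the layer sizes, the 128x128 input resolution, and the name build_gaze_estimator are all hypothetical choices for this sketch.

# Illustrative sketch only: a minimal convolutional gaze estimator that regresses
# normalized screen coordinates from two eye crops. Architecture details are
# assumptions for illustration, not the model described in the paper.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_gaze_estimator(eye_shape=(128, 128, 3)):
    """Takes left/right eye crops, returns normalized 2-D gaze coordinates."""
    left = layers.Input(shape=eye_shape, name="left_eye")
    right = layers.Input(shape=eye_shape, name="right_eye")

    def eye_tower(x):
        # Shared-structure conv tower per eye: three conv/pool stages.
        for filters in (32, 64, 128):
            x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
            x = layers.MaxPooling2D()(x)
        return layers.Flatten()(x)

    features = layers.Concatenate()([eye_tower(left), eye_tower(right)])
    features = layers.Dense(128, activation="relu")(features)
    gaze_xy = layers.Dense(2, name="gaze_xy")(features)  # (x, y) in [0, 1]

    model = Model(inputs=[left, right], outputs=gaze_xy)
    model.compile(optimizer="adam", loss="mse")  # Euclidean-style regression
    return model

In practice, a base model of this kind is usually followed by a short per-user calibration (fitting a lightweight correction on a few known fixation targets) to reach research-grade accuracy; this is a general practice note rather than a description of the paper's exact procedure.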