gazeMapper: A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers

Cited by: 1
Authors
Niehorster, Diederick C. [1,2]
Hessels, Roy S. [3]
Nyström, Marcus [1]
Benjamins, Jeroen S. [4]
Hooge, Ignace T. C. [3]
Affiliations
[1] Lund University, Lund University Humanities Lab, Lund, Sweden
[2] Lund University, Department of Psychology, Lund, Sweden
[3] Utrecht University, Helmholtz Institute, Experimental Psychology, Utrecht, Netherlands
[4] Utrecht University, Experimental Psychology & Social, Health and Organizational Psychology, Utrecht, Netherlands
Keywords
Eye tracking; Wearable eye tracking; Mobile eye tracking; Eye movements; Gaze; Data quality; Head-fixed reference frame; World-fixed reference frame; Plane; Surface; Tool
KeyWords Plus
VISUAL-ATTENTION; POSE ESTIMATION; TRACKING; HEAD; BEHAVIOR; MOVEMENTS; INFORMATION; ALGORITHM; STRATEGY; FIGHTERS
DOI
10.3758/s13428-025-02704-4
Chinese Library Classification (CLC)
B841 [Psychological research methods]
Subject classification code
040201
Abstract
The problem: wearable eye trackers deliver eye-tracking data on a scene video that is acquired by a camera affixed to the participant's head. Analyzing and interpreting such head-centered data is difficult and laborious manual work. Automated methods for mapping eye-tracking data to a world-centered reference frame (e.g., screens and tabletops) are available, and these usually make use of fiducial markers. However, such mapping methods may be difficult to implement, expensive, and eye-tracker-specific. The solution: here we present gazeMapper, an open-source tool for automated mapping and processing of eye-tracking data. gazeMapper can (1) transform head-centered data to planes in the world, (2) synchronize recordings from multiple participants, and (3) determine data quality measures, e.g., accuracy and precision. gazeMapper comes with a GUI application (Windows, macOS, and Linux) and supports 11 different wearable eye trackers from AdHawk, Meta, Pupil, SeeTrue, SMI, Tobii, and Viewpointsystem. It is also possible to sidestep the GUI and use gazeMapper as a Python library directly.
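The gaze-to-plane mapping of step (1) is, in the class of methods the abstract describes, typically built on fiducial markers: markers at known positions on the plane are detected in each scene-camera frame, a homography from image coordinates to plane coordinates is estimated, and the head-centered gaze point is projected through it. Below is a minimal Python/OpenCV sketch of that general technique; it is not gazeMapper's own implementation, and the marker dictionary, marker IDs, and plane coordinates are illustrative assumptions.

import cv2
import numpy as np

# Hypothetical layout: plane coordinates (mm) of each ArUco marker's four
# corners, in detection order (top-left, top-right, bottom-right, bottom-left).
PLANE_CORNERS = {
    0: np.array([[0, 0], [50, 0], [50, 50], [0, 50]], dtype=np.float32),
    1: np.array([[300, 0], [350, 0], [350, 50], [300, 50]], dtype=np.float32),
}

def gaze_to_plane(frame_gray, gaze_px):
    """Map a gaze point (image pixels) to plane coordinates (mm), or None.

    Only meaningful when the gaze actually lands on the plane, since a
    single homography describes the plane, not the rest of the scene.
    """
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
    corners, ids, _ = detector.detectMarkers(frame_gray)
    if ids is None:
        return None
    img_pts, plane_pts = [], []
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if int(marker_id) in PLANE_CORNERS:
            img_pts.append(marker_corners.reshape(4, 2))
            plane_pts.append(PLANE_CORNERS[int(marker_id)])
    if not img_pts:
        return None
    # Homography from scene-camera image coordinates to plane coordinates.
    H, _ = cv2.findHomography(np.vstack(img_pts), np.vstack(plane_pts))
    if H is None:
        return None
    mapped = cv2.perspectiveTransform(np.array([[gaze_px]], dtype=np.float32), H)
    return mapped[0, 0]  # (x, y) on the plane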
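The accuracy and precision of step (3) follow the standard definitions from the eye-tracking data-quality literature: accuracy is the mean angular offset between measured gaze directions and a known fixation target, and precision is commonly reported as the root mean square of sample-to-sample (RMS-S2S) angular distances. A minimal NumPy sketch of these definitions (again illustrative, not gazeMapper's own code):

import numpy as np

def angular_offsets_deg(gaze_dirs, target_dir):
    # Angle between each unit gaze direction and the unit target direction.
    cos = np.clip(gaze_dirs @ target_dir, -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def accuracy_deg(gaze_dirs, target_dir):
    # Accuracy: mean angular offset from the fixation target.
    return float(np.mean(angular_offsets_deg(gaze_dirs, target_dir)))

def precision_rms_s2s_deg(gaze_dirs):
    # Precision: RMS of successive sample-to-sample angular distances.
    cos = np.clip(np.sum(gaze_dirs[:-1] * gaze_dirs[1:], axis=1), -1.0, 1.0)
    d = np.degrees(np.arccos(cos))
    return float(np.sqrt(np.mean(d ** 2)))

# Example: 100 noisy unit gaze vectors around a target straight ahead.
rng = np.random.default_rng(0)
gaze = np.array([0.01, 0.005, 1.0]) + rng.normal(0.0, 0.002, (100, 3))
gaze /= np.linalg.norm(gaze, axis=1, keepdims=True)
target = np.array([0.0, 0.0, 1.0])
print(accuracy_deg(gaze, target), precision_rms_s2s_deg(gaze))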
Pages: 18