Applications of eye-tracking technologies in military vehicle environment

Cited by: 0
Authors
Institutions
Source
SOUTHEASTCON 2023 | 2023
Keywords
eye tracking; gaze; pupil; fixations; military; autonomous vehicles;
DOI
10.1109/SoutheastCon51012.2023.10115082
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Eye-tracking technologies are becoming ubiquitous in the modern world, with use cases ranging from mobile devices to medical tests and from emotion recognition to operating vehicles and aircraft. However, the eye-tracking data acquisition process remains vulnerable and prone to multiple inconsistencies when reading gaze features. In this paper, we survey the most common challenges encountered when gathering gaze data using all types of state-of-the-art sensors and provide an overview of eye-tracking data applications in the military vehicle environment.
Pages: 761-765
Page count: 5
References
31 in total
[1]   Processing of Eye/Head-Tracking Data in Large-Scale Naturalistic Driving Data Sets [J].
Ahlstrom, Christer ;
Victor, Trent ;
Wege, Claudia ;
Steinmetz, Erik .
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2012, 13 (02) :553-564
[2]  
[Anonymous], HYUNDAI HCD 14 GENES
[3]  
[Anonymous], TOBII PRONANO
[4]  
[Anonymous], APACHE ATTACK HELICO
[5]  
[Anonymous], THE 3
[6]  
[Anonymous], EyeLink 1000 Plus
[7]  
[Anonymous], EYE
[8]   Estimating Pilots' Cognitive Load From Ocular Parameters Through Simulation and In-Flight Studies [J].
Babu, M. Dilli ;
Shree, Jeevitha D. V. ;
Prabhakar, Gowdham ;
Saluja, Kamal Preet Singh ;
Pashilkar, Abhay ;
Biswas, Pradipta .
JOURNAL OF EYE MOVEMENT RESEARCH, 2019, 12 (03)
[9]   Prediction of effort and eye movement measures from driving scene components [J].
Cabrall, Christopher D. D. ;
Happee, Riender ;
de Winter, Joost C. F. .
TRANSPORTATION RESEARCH PART F-TRAFFIC PSYCHOLOGY AND BEHAVIOUR, 2020, 68 :187-197
[10]   Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art [J].
Fuhl, Wolfgang ;
Tonsen, Marc ;
Bulling, Andreas ;
Kasneci, Enkelejda .
MACHINE VISION AND APPLICATIONS, 2016, 27 (08) :1275-1288