Real Time Eye Gaze Tracking for Human Machine Interaction in the Cockpit

Cited by: 1
Authors
Turetkin, Engin [1 ]
Saeedi, Sareh [1 ]
Bigdeli, Siavash [1 ]
Stadelmann, Patrick [1 ]
Cantale, Nicolas [1 ]
Lutnyk, Luis [2 ]
Raubal, Martin [2 ]
Dunbar, L. Andrea [1 ]
Affiliations
[1] CSEM SA, Edge AI & Vis Grp, Rue Jaquet Droz 1, CH-2002 Neuchatel, Switzerland
[2] Swiss Fed Inst Technol, Inst Cartog & Geoinformat, Stefano Franscini Pl 5, CH-8093 Zurich, Switzerland
Source
AI AND OPTICAL DATA SCIENCES III | 2022 / Vol. 12019
Funding
European Union Horizon 2020;
Keywords
Gaze-based interaction; Eye gaze detection; Aviation; Computer vision; Machine learning; Human-machine interaction;
DOI
10.1117/12.2607434
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The aeronautics industry has pioneered safety innovations, from digital checklists to moving maps that improve pilot situational awareness and support safe ground movements. Today, pilots deal with increasingly complex cockpit environments and densifying air traffic. Here we present an intelligent vision system that allows real-time human-machine interaction in the cockpit to reduce pilots' workload. The challenges for such a vision system include extreme changes in background light intensity, a large field of view, and variable working distances. Adapted hardware combined with state-of-the-art computer vision and machine learning algorithms for eye gaze detection enables a smooth and accurate real-time feedback system. The current system has been over-specified to explore optimized solutions for different use cases. The algorithmic pipeline for eye gaze tracking was developed and iteratively optimized to reach the speed and accuracy required for the aviation use cases. The pipeline, a combination of data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye gaze estimation error was evaluated as the point-of-regard distance error with respect to the 3D point location. An average error of less than 1.1 cm was achieved over 28 gaze points representing cockpit instruments placed about 80-110 cm from the participants' eyes. The angular gaze deviation drops below 1 degree for the panels that required accurate eye gaze according to the use cases.
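To put the reported accuracy in context, the sketch below converts the average point-of-regard error into an angular gaze deviation at the stated viewing distances, and relates the 60 fps frame rate to the roughly 32 ms latency. It is a minimal illustration using only the numbers quoted in the abstract; the variable names and the small-angle conversion are assumptions for illustration, not the paper's evaluation code.

import math

# Minimal sketch (assumed helper, not from the paper): relate the abstract's
# point-of-regard error and viewing distances to an angular gaze deviation,
# and the frame rate to the reported end-to-end latency.

POINT_OF_REGARD_ERROR_CM = 1.1        # average error reported in the abstract
VIEWING_DISTANCES_CM = (80.0, 110.0)  # approximate instrument distances from the eyes
FRAME_RATE_FPS = 60.0
LATENCY_MS = 32.0

def angular_error_deg(error_cm, distance_cm):
    # Small-angle conversion, assuming the error is measured in a plane
    # perpendicular to the line of sight.
    return math.degrees(math.atan(error_cm / distance_cm))

for d in VIEWING_DISTANCES_CM:
    print(f"{POINT_OF_REGARD_ERROR_CM} cm error at {d:.0f} cm "
          f"~ {angular_error_deg(POINT_OF_REGARD_ERROR_CM, d):.2f} deg")

# 60 fps corresponds to ~16.7 ms per frame, so a ~32 ms end-to-end latency
# is roughly two frame periods.
frame_period_ms = 1000.0 / FRAME_RATE_FPS
print(f"frame period ~ {frame_period_ms:.1f} ms; "
      f"latency ~ {LATENCY_MS / frame_period_ms:.1f} frame periods")

At both distances the deviation comes out well below 1 degree (about 0.8 degrees at 80 cm and 0.6 degrees at 110 cm), consistent with the accuracy the abstract reports for the panels that required precise gaze tracking.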
Pages: 10