An Efficient Eye Tracking Using POMDP for Robust Human Computer Interaction

Times Cited: 0
Authors
Rhee, Ji Hye [1 ]
Sung, Won Jun [2 ]
Nam, Mi Young [3 ]
Byun, Hyeran [1 ]
Rhee, Phill Kyu [2 ]
Affiliations
[1] Yonsei Univ, Dept Comp Sci, Seoul 120749, South Korea
[2] Inha Univ, Dept Comp Sci & Engn, Inchon, South Korea
[3] YM Naeultech, Inchon, South Korea
Source
COMPUTER VISION SYSTEMS (ICVS 2015) | 2015 / Volume 9163
Keywords
Eye tracking; POMDP; Real-time Q-learning; World-context model; Image-quality analysis; FRAMEWORK;
DOI
10.1007/978-3-319-20904-3_37
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We propose an adaptive eye tracking system for robust human-computer interaction in dynamically changing environments, based on the partially observable Markov decision process (POMDP). Real-time eye tracking optimization is tackled with a flexible world-context-model-based POMDP approach that requires less data and adaptation time than hard world-context-model approaches. The challenge is to divide the huge belief space into world-context models and to search for optimal control parameters within the current world-context model under real-time constraints. Offline learning determines multiple world-context models through image-quality analysis over the joint space of transition, observation, and reward distributions, and an approximate world-context model is balanced with online learning over a localized horizon. The online learning is formulated as dynamic parameter control with incomplete information under real-time constraints and is solved by a real-time Q-learning approach. Extensive experiments on realistic videos yield very encouraging results.
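The abstract describes online adaptation as real-time Q-learning over control parameters, with a separate value table scoped to the current world-context model. The sketch below is a minimal illustration of such a scheme, not the authors' implementation: the context labels, the parameter grid (detector threshold and search-window scale), the reward signal, and all function names are assumptions made for illustration only.

```python
import random
from collections import defaultdict

# Hypothetical image-quality contexts and control-parameter settings;
# the paper's actual world-context models and parameter space are not
# specified in the abstract.
CONTEXTS = ["bright", "dim", "blurred"]
ACTIONS = [(t, s) for t in (0.3, 0.5, 0.7)   # assumed detector threshold
                  for s in (1, 2, 3)]        # assumed search-window scale

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1        # learning rate, discount, exploration

# One Q-table per world-context model, so adaptation stays local to the
# current context rather than spanning the full belief space.
q_tables = {c: defaultdict(float) for c in CONTEXTS}

def select_action(context):
    """Epsilon-greedy choice of control parameters for the current context."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    table = q_tables[context]
    return max(ACTIONS, key=lambda a: table[a])

def update(context, action, reward, next_context):
    """One real-time Q-learning step after observing tracking quality."""
    table = q_tables[context]
    best_next = max(q_tables[next_context][a] for a in ACTIONS)
    table[action] += ALPHA * (reward + GAMMA * best_next - table[action])
```

In this hypothetical loop, the tracker would call select_action once per frame, measure a tracking-quality reward (e.g., pupil-detection confidence), and call update before the next frame, keeping the per-step cost constant to satisfy the real-time constraint.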
Pages: 415-423
Number of pages: 9
Related Papers
50 records in total
  • [41] Automatic Visual Attention Detection for Mobile Eye Tracking Using Pre-Trained Computer Vision Models and Human Gaze
    Barz, Michael
    Sonntag, Daniel
    SENSORS, 2021, 21 (12)
  • [42] Human Movement Direction Prediction using Virtual Reality and Eye Tracking
    Pettersson, Julius
    Falkman, Petter
    2021 22ND IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT), 2021, : 889 - 894
  • [43] Application of Combined Brain Computer Interface and Eye Tracking
    Wang, Pengchao
    Song, Zuoting
    Chen, Hao
    Fang, Tao
    Zhang, Yuan
    Zhang, Xueze
    Wang, Shouyan
    Li, Hui
    Lin, Yifang
    Jia, Jie
    Zhang, Lihua
    Kang, Xiaoyang
    2021 9TH IEEE INTERNATIONAL WINTER CONFERENCE ON BRAIN-COMPUTER INTERFACE (BCI), 2021, : 258 - 262
  • [44] Intended Human Arm Movement Direction Prediction using Eye Tracking
    Pettersson, Julius
    Falkman, Petter
    INTERNATIONAL JOURNAL OF COMPUTER INTEGRATED MANUFACTURING, 2024, 37 (09) : 1107 - 1125
  • [45] Research on Algorithm of Eye Tracking in Identification of Human Eye
    Li Ye
    Huang Jiangjiang
    Wu Yan
    PROCEEDINGS OF THE THIRD INTERNATIONAL SYMPOSIUM ON TEST AUTOMATION & INSTRUMENTATION, VOLS 1 - 4, 2010, : 675 - 679
  • [46] PuReST: Robust Pupil Tracking for Real-Time Pervasive Eye Tracking
    Santini, Thiago
    Fuhl, Wolfgang
    Kasneci, Enkelejda
    2018 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS (ETRA 2018), 2018,
  • [47] Human Movement Direction Classification using Virtual Reality and Eye Tracking
    Pettersson, Julius
    Falkman, Petter
    30TH INTERNATIONAL CONFERENCE ON FLEXIBLE AUTOMATION AND INTELLIGENT MANUFACTURING (FAIM2021), 2020, 51 : 95 - 102
  • [48] AN EFFICIENT METHOD FOR EYE TRACKING AND EYE-GAZED FOV ESTIMATION
    Li, Xiaokun
    Wee, William G.
    2009 16TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOLS 1-6, 2009, : 2597 - +
  • [49] A hierarchical approach for human eye tracking
    Zeng, ZH
    Ma, SD
    LOW-LIGHT-LEVEL AND REAL-TIME IMAGING SYSTEMS, COMPONENTS, AND APPLICATIONS, 2003, 4796 : 227 - 234
  • [50] An experimental study on relationship between foveal range and FoV of a human eye using eye tracking devices
    Adithya, B.
    Kumar, Pavan B. N.
    Lee, Hanna
    Kim, Ji Yeon
    Moon, Jae Cheol
    Chai, Young Ho
    2018 INTERNATIONAL CONFERENCE ON ELECTRONICS, INFORMATION, AND COMMUNICATION (ICEIC), 2018, : 348 - 352