Identification of eye movements from non-frontal face images for eye-controlled systems

Cited by: 4
Authors
Lin M. [1 ,2 ]
Li B. [1 ]
Liu Q.-H. [1 ]
Affiliations
[1] School of Mechatronic Engineering and Automation, Shanghai University, Shanghai
[2] Shanghai Medical Instrumentation College, University of Shanghai for Science and Technology, Shanghai
Keywords
Eye movement identification; eye-based system; human-computer interaction; non-frontal face images; pupil center localization
DOI
10.1007/s11633-014-0827-0
Abstract
The novel eye-based human-computer interaction (HCI) system aims to provide people, especially disabled persons, with a new way of communicating with their surroundings. It adopts a series of continual eye movements as input to perform simple control activities. Identification of eye movements is the crucial technology in these eye-based HCI systems. At present, research on eye movement identification mainly focuses on frontal face images. In practice, however, acquiring non-frontal face images is more realistic. In this paper, we discuss the identification of eye movements from non-frontal face images. Firstly, the original head-shoulder images at azimuths from 0° to ±60° are sampled without any auxiliary light source. Secondly, the non-frontal face region is detected using AdaBoost cascade classifiers. After that, eye windows are roughly extracted by the integral projection function. Then, we propose a new method to calculate the x-y coordinates of the pupil center point by searching for the minimal intensity value in the eye windows. According to the trajectory of the pupil center points, different eye movements (eye moving left, right, up, or down) are successfully identified. A set of experiments is presented. © 2014, Institute of Automation, Chinese Academy of Sciences and Springer-Verlag Berlin Heidelberg.
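The pipeline in the abstract (integral projection over the eye window, minimal-intensity pupil localization, trajectory-based classification) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the helper names, the use of a centroid over the k darkest pixels (a small robustness tweak on a single minimal-intensity search), and the displacement threshold are all assumptions.

```python
import numpy as np

def vertical_projection(gray):
    """Integral projection along columns: per-column intensity sums.

    Dark structures (eyebrows, pupil) produce local minima, which is
    how the abstract's rough eye-window extraction is typically done.
    """
    return gray.sum(axis=0)

def horizontal_projection(gray):
    """Per-row intensity sums, used analogously for vertical placement."""
    return gray.sum(axis=1)

def pupil_center(eye_window, k=9):
    """Estimate the pupil center inside a grayscale eye window.

    The paper searches for the minimal intensity value; here we take
    the centroid of the k darkest pixels (assumed variant) so a single
    noisy pixel cannot dominate. Returns (x, y) in window coordinates.
    """
    flat = eye_window.ravel()
    idx = np.argpartition(flat, k)[:k]          # indices of k darkest pixels
    ys, xs = np.unravel_index(idx, eye_window.shape)
    return float(xs.mean()), float(ys.mean())

def classify_movement(p0, p1, thresh=2.0):
    """Classify the movement between two pupil-center samples.

    The dominant displacement axis decides left/right vs. up/down;
    displacements below `thresh` pixels (assumed value) are fixations.
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    if max(abs(dx), abs(dy)) < thresh:
        return "fixation"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

Face detection itself (the AdaBoost cascade step) is omitted here; in practice it would supply the face region from which the eye windows above are projected.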
Pages: 543-554 (11 pages)
Related papers
28 references in total
[1]  
Fu K.C.D., Nakamura Y., Yamamoto T., Ishiguro H., Analysis of motor synergies utilization for optimal movement generation for a human-like robotic arm, International Journal of Automation and Computing, 10, 6, pp. 515-524, (2013)
[2]  
Chao F., Zhang X., Lin H.X., Zhou C.L., Jiang M., Learning robotic hand-eye coordination through a developmental constraint driven approach, International Journal of Automation and Computing, 10, 5, pp. 414-424, (2013)
[3]  
Hammoud R.I., Mulligan J.B., Introduction to eye monitoring, Passive Eye Monitoring: Algorithms, Applications and Experiments (Signals and Communications Technology), pp. 1-19, (2008)
[4]  
Venkataramanan S., Prabhat P., Choudhury S.R., Nemade H.B., Sahambi J.S., Biomedical instrumentation based on electrooculogram (EOG) signal processing and application to a hospital alarm system, Proceedings of International Conference on Intelligent Sensing and Information Processing, pp. 535-540, (2005)
[5]  
Deng L.Y., Hsu C.L., Lin T.C., Tuan J.S., Chang S.M., EOG-based human-computer interface system development, Expert Systems with Applications, 37, 4, pp. 3337-3343, (2010)
[6]  
Deng L.Y., Hsu C.L., Lin T.C., Tuan J.S., Chen Y.H., EOG-based signal detection and verification for HCI, Proceedings of the 8th International Conference on Machine Learning and Cybernetics, pp. 3342-3348, (2009)
[7]  
Beymer D., Flickner M., Eye gaze tracking using an active stereo head, Proceedings of Computer Vision and Pattern Recognition, pp. 451-458, (2003)
[8]  
Hutchinson T.E., White K.P. Jr., Martin W.N., Reichert K.C., Frey L.A., Human-computer interaction using eye-gaze input, IEEE Transactions on Systems, Man, and Cybernetics, 19, 6, pp. 1527-1534, (1989)
[9]  
Betke M., Gips J., Fleming P., The camera mouse: Visual tracking of body features to provide computer access for people with severe disabilities, IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10, 1, pp. 1-10, (2002)
[10]  
Zhang C., Chi J.N., Zhang Z.H., Gao X.L., Gaze estimation in a gaze tracking system, Science China Information Sciences, 54, 11, pp. 2295-2306, (2011)