A robotic facial expression recognition system using real-time vision system

Cited by: 0
Authors
Department of Electrical and Control Engineering, National Chiao Tung University, 1001 Ta Hsueh Road, Hsinchu, 300, Taiwan [1]
Not specified [2]
Affiliations
[1] Department of Electrical and Control Engineering, National Chiao Tung University, Hsinchu, 300
[2] Mechanical and Systems Research Laboratories, Industrial Technology Research Institute, 195 Chung Hsing Road, Chutung, Hsinchu, 310
Source
Key Eng Mat | 2008 / Vols. 381-382
Keywords
DSP-based embedded system; Facial expression; Human-robot interaction; Optical metrology and image processing; Visual tracking;
DOI
10.4028/www.scientific.net/kem.381-382.375
Abstract
The capability of recognizing human facial expressions plays an important role in the development of advanced human-robot interaction. By recognizing facial expressions, a robot can interact with a user in a more natural and friendly manner. In this paper, we propose a facial expression recognition system based on an embedded image processing platform that classifies different facial expressions online in real time. A low-cost embedded vision system has been designed and realized for robotic applications using a CMOS image sensor and a digital signal processor (DSP). The current design acquires thirty 640×480 image frames per second (30 fps). The proposed emotion recognition algorithm has been successfully implemented on the real-time vision system. Experimental results on a pet robot show that the robot can interact with a person in a responsive manner. The developed image processing platform accelerates the recognition speed to 25 recognitions per second with an average online recognition rate of 74.4% for five facial expressions.
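The pipeline summarized in the abstract (frame acquisition, face localization, expression classification, per-second throughput measurement) can be illustrated with a minimal sketch. The code below is not the paper's DSP implementation: it assumes a desktop webcam and OpenCV's Haar-cascade face detector in place of the CMOS-sensor/DSP platform, and the classify_expression() stub and EXPRESSIONS label list are hypothetical stand-ins for the authors' five-class expression classifier.

import time
import cv2

# Assumed label set; the paper reports five expressions but does not list them here.
EXPRESSIONS = ["neutral", "happy", "sad", "surprised", "angry"]

# Haar-cascade face detector shipped with OpenCV (stand-in for the embedded face tracker).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_img):
    """Hypothetical placeholder for the five-class expression classifier."""
    return EXPRESSIONS[0]

def main():
    cap = cv2.VideoCapture(0)                # webcam in place of the CMOS image sensor
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)   # match the 640x480 acquisition size
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    frames, start = 0, time.time()
    while frames < 100:                      # short run, long enough to estimate throughput
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
        for (x, y, w, h) in faces:
            label = classify_expression(gray[y:y + h, x:x + w])
            print(f"frame {frames}: {label}")
        frames += 1
    elapsed = time.time() - start
    if elapsed > 0:
        print(f"{frames / elapsed:.1f} frames per second")
    cap.release()

if __name__ == "__main__":
    main()

On a desktop machine this loop runs well above the 25 recognitions per second reported for the embedded platform; the sketch only makes the structure of the real-time loop concrete, not its performance.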
Pages: 375-378
Number of pages: 3