Facial asymmetry quantification for expression invariant human identification

Cited by: 82
Authors
Liu, YX
Schmidt, KL
Cohn, JF
Mitra, S
Affiliations
[1] Carnegie Mellon Univ, Inst Robot, Pittsburgh, PA 15213 USA
[2] Univ Pittsburgh, Dept Psychol, Pittsburgh, PA 15260 USA
[3] Carnegie Mellon Univ, Dept Stat, Pittsburgh, PA 15213 USA
Funding
US National Science Foundation;
Keywords
RECOGNITION; FACE; EIGENFACES; SYMMETRY; GENDER;
DOI
10.1016/S1077-3142(03)00078-X
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We investigate facial asymmetry as a biometric under expression variation. For the first time, we have defined two types of quantified facial asymmetry measures that are easily computable from facial images and videos. Our findings show that the asymmetry measures of automatically selected facial regions capture individual differences that are relatively stable to facial expression variations. More importantly, a synergy is achieved by combining facial asymmetry information with conventional EigenFace and FisherFace methods. We have assessed the generality of these findings across two publicly available face databases: Using a random subset of 110 subjects from the FERET database, a 38% classification error reduction rate is obtained. Error reduction rates of 45-100% are achieved on 55 subjects from the Cohn-Kanade AU-Coded Facial Expression Database. These results suggest that facial asymmetry may provide complementary discriminative information to human identification methods, which has been missing in automatic human identification. (C) 2003 Elsevier Inc. All rights reserved.
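The density-difference style of asymmetry measure described in the abstract can be sketched as follows. This is an illustrative reading only, assuming a grayscale image already aligned so its vertical midline coincides with the face midline; the function name and alignment step are assumptions, not the paper's implementation:

```python
import numpy as np

def density_difference(img: np.ndarray) -> np.ndarray:
    """Subtract the horizontally mirrored image from the original.

    Assumes `img` is a 2-D grayscale array whose central column lies on
    the face midline. The result is zero everywhere for a perfectly
    symmetric face; nonzero entries quantify left-right asymmetry.
    """
    mirrored = img[:, ::-1]  # reflect about the vertical midline
    return img - mirrored

# Toy 2x3 "face": each row is antisymmetric about the middle column
face = np.array([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])
asym = density_difference(face)
```

Per-region statistics of such an asymmetry map (rather than the raw map itself) would then serve as the expression-stable features combined with EigenFace/FisherFace projections.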
Pages: 138-159
Page count: 22