Active Labeling of Facial Feature Points

Cited by: 0
Authors
He, Menghua [1 ]
Wang, Shangfei [1 ]
Ji, Qiang [2 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei, Anhui, Peoples R China
[2] Rensselaer Polytech Inst, Dept ECSE, Troy, NY USA
Source
2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII) | 2013
Keywords
feature points; Bayesian network; active labeling; mutual information; recognition; models
DOI
10.1109/ACII.2013.16
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Although considerable progress has been made in the field of facial feature point detection and tracking, accurate feature point tracking remains very challenging. Manual feature point labeling and correction are time-consuming and labor-intensive. To alleviate this problem, an active feature point labeling method is proposed in this paper. First, the spatial relations among feature points are modeled by a Bayesian network. Second, the mutual information between a feature point and the remaining feature points is calculated in two steps: in the first step, to identify the most informative facial region, the mutual information between one facial sub-region and the other sub-regions is calculated; in the second step, the mutual information between one feature point and the other feature points in the most informative facial sub-region is calculated to rank the facial feature points. Users label the feature points in descending order of mutual information. After that, the human corrections and the image measurements are integrated by the Bayesian network to produce the refined annotations. Simulated experiments on the extended Cohn-Kanade (CK+) database demonstrate the effectiveness of our approach.
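The mutual-information ranking described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes the feature-point coordinates are jointly Gaussian (so mutual information reduces to a closed-form log-determinant expression estimated from sample covariances), and the function names `gaussian_mi` and `rank_points_by_mi` are hypothetical. The paper's actual method computes mutual information through a learned Bayesian network and first ranks facial sub-regions before ranking individual points.

```python
import numpy as np


def gaussian_mi(x, y):
    """Estimate I(X; Y) assuming X and Y are jointly Gaussian
    (a simplifying assumption, not the paper's Bayesian-network model):
    I(X; Y) = 0.5 * log( det(Cov[X]) * det(Cov[Y]) / det(Cov[X, Y]) ).
    x: (n_samples, dx) array, y: (n_samples, dy) array."""
    cx = np.atleast_2d(np.cov(x, rowvar=False))
    cy = np.atleast_2d(np.cov(y, rowvar=False))
    cxy = np.cov(np.hstack([x, y]), rowvar=False)
    _, ldx = np.linalg.slogdet(cx)
    _, ldy = np.linalg.slogdet(cy)
    _, ldxy = np.linalg.slogdet(cxy)
    return 0.5 * (ldx + ldy - ldxy)


def rank_points_by_mi(coords):
    """Rank feature points by mutual information with the remaining points,
    in descending order (most informative first).
    coords: (n_samples, n_points, 2) array of (x, y) feature locations."""
    n_samples, n_points, _ = coords.shape
    flat = coords.reshape(n_samples, -1)  # (n_samples, 2 * n_points)
    scores = []
    for i in range(n_points):
        xi = flat[:, 2 * i:2 * i + 2]                      # point i
        rest = np.delete(flat, [2 * i, 2 * i + 1], axis=1)  # all others
        scores.append(gaussian_mi(xi, rest))
    return list(np.argsort(scores)[::-1])
```

A point that moves in lockstep with many others (high mutual information) is labeled first, so that the Bayesian network can propagate the human correction to the remaining points; a point that varies independently carries little information about the rest and is labeled last.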
Pages: 55-60
Number of pages: 6