Infant AFAR: Automated facial action recognition in infants

Cited by: 16
Authors
Onal Ertugrul, Itir [1 ]
Ahn, Yeojin Amy [2 ]
Bilalpur, Maneesh [3 ]
Messinger, Daniel S. [2 ]
Speltz, Matthew L. [4 ]
Cohn, Jeffrey F. [3 ]
Affiliations
[1] Univ Utrecht, Utrecht, Netherlands
[2] Univ Miami, Miami, FL USA
[3] Univ Pittsburgh, Pittsburgh, PA USA
[4] Univ Washington, Seattle, WA 98195 USA
Funding
U.S. National Science Foundation; U.S. National Institutes of Health
Keywords
Automatic facial action unit detection; Facial action coding system; Infant behavior; Cross domain generalizability; Deep learning; CRANIOFACIAL MICROSOMIA; EXPRESSIONS; FACE; MODELS;
DOI
10.3758/s13428-022-01863-y
Chinese Library Classification
B841 [Research methods in psychology]
Discipline classification code
040201
Abstract
Automated detection of facial action units in infants is challenging. Infant faces have different proportions, less texture, fewer wrinkles and furrows, and unique facial actions relative to adults. For these and related reasons, action unit (AU) detectors trained on adult faces may generalize poorly to infant faces. To train and test AU detectors for infant faces, we trained convolutional neural networks (CNNs) on adult video databases and fine-tuned these networks on two large, manually annotated infant video databases that differ in context, head pose, illumination, video resolution, and infant age. The AUs were those central to the expression of positive and negative emotion. AU detectors trained on infants greatly outperformed ones trained previously on adults. Training AU detectors across infant databases afforded greater robustness to between-database differences than training database-specific AU detectors, and outperformed the previous state of the art in infant AU detection. The resulting AU detection system, which we refer to as Infant AFAR (Automated Facial Action Recognition), is available to the research community for further testing and applications in infant emotion, social interaction, and related topics.
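The abstract describes a transfer-learning recipe: pretrain AU detectors on adult face video databases, then fine-tune them on manually annotated infant video databases using one binary label per AU. The sketch below illustrates that general recipe in PyTorch. The ResNet-18 backbone, the example AU list, the checkpoint path, and the data-loader interface are all illustrative assumptions, not the authors' released Infant AFAR implementation.

```python
# Hedged sketch: fine-tuning a CNN pretrained on adult AU data for infant AU detection.
# Backbone, AU list, checkpoint, and loaders are assumptions for illustration only;
# this is not the released Infant AFAR code.
import torch
import torch.nn as nn
from torchvision import models

AUS = ["AU1", "AU2", "AU4", "AU6", "AU12", "AU20"]  # example AUs tied to positive/negative affect

# 1) Backbone assumed pretrained on adult AU databases (weights loaded from a checkpoint).
backbone = models.resnet18(weights=None)
backbone.fc = nn.Linear(backbone.fc.in_features, len(AUS))  # multi-label AU head
# backbone.load_state_dict(torch.load("adult_au_pretrained.pt"))  # hypothetical checkpoint

# 2) Fine-tune on infant frames with multi-label AU occurrence targets.
criterion = nn.BCEWithLogitsLoss()  # independent sigmoid per AU, since AUs can co-occur
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)

def fine_tune(infant_loader, epochs=5, device="cpu"):
    """infant_loader yields (frames, au_labels) with au_labels of shape [batch, len(AUS)]."""
    backbone.to(device).train()
    for _ in range(epochs):
        for frames, au_labels in infant_loader:
            optimizer.zero_grad()
            logits = backbone(frames.to(device))
            loss = criterion(logits, au_labels.float().to(device))
            loss.backward()
            optimizer.step()

def predict_aus(frames, threshold=0.5, device="cpu"):
    """Return per-frame binary AU occurrence predictions."""
    backbone.to(device).eval()
    with torch.no_grad():
        probs = torch.sigmoid(backbone(frames.to(device)))
    return probs > threshold
```

Treating each AU as an independent binary output (rather than a single softmax over expressions) reflects the Facial Action Coding System convention that multiple AUs can be active in the same frame.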
Pages: 1024-1035
Page count: 12