Infant AFAR: Automated facial action recognition in infants

Cited by: 16
Authors
Onal Ertugrul, Itir [1 ]
Ahn, Yeojin Amy [2 ]
Bilalpur, Maneesh [3 ]
Messinger, Daniel S. [2 ]
Speltz, Matthew L. [4 ]
Cohn, Jeffrey F. [3 ]
Affiliations
[1] Univ Utrecht, Utrecht, Netherlands
[2] Univ Miami, Miami, FL USA
[3] Univ Pittsburgh, Pittsburgh, PA USA
[4] Univ Washington, Seattle, WA 98195 USA
Funding
US National Science Foundation; US National Institutes of Health;
Keywords
Automatic facial action unit detection; Facial action coding system; Infant behavior; Cross domain generalizability; Deep learning; CRANIOFACIAL MICROSOMIA; EXPRESSIONS; FACE; MODELS;
DOI
10.3758/s13428-022-01863-y
Chinese Library Classification
B841 [Psychological research methods];
Discipline code
040201;
Abstract
Automated detection of facial action units in infants is challenging. Infant faces have different proportions, less texture, fewer wrinkles and furrows, and unique facial actions relative to adults. For these and related reasons, action unit (AU) detectors that are trained on adult faces may generalize poorly to infant faces. To train and test AU detectors for infant faces, we trained convolutional neural networks (CNN) in adult video databases and fine-tuned these networks in two large, manually annotated, infant video databases that differ in context, head pose, illumination, video resolution, and infant age. AUs were those central to expression of positive and negative emotion. AU detectors trained in infants greatly outperformed ones trained previously in adults. Training AU detectors across infant databases afforded greater robustness to between-database differences than did training database specific AU detectors and outperformed previous state-of-the-art in infant AU detection. The resulting AU detection system, which we refer to as Infant AFAR (Automated Facial Action Recognition), is available to the research community for further testing and applications in infant emotion, social interaction, and related topics.
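The abstract describes a transfer-learning recipe: train CNNs on adult face video databases, then fine-tune them on annotated infant databases for multi-label AU detection (several AUs can co-occur in one frame). A minimal sketch of that recipe is below; it is illustrative only, not the authors' architecture, and the backbone, layer sizes, and AU count are assumptions.

```python
# Hedged sketch of the fine-tuning recipe summarized in the abstract:
# a CNN "pretrained" on adult faces gets a new multi-label head and is
# fine-tuned on infant data. Layer sizes and NUM_AUS are assumptions.
import torch
import torch.nn as nn

NUM_AUS = 9  # assumption: number of action units detected

# Stand-in backbone (in the paper, pretrained on adult video databases;
# here randomly initialized for a self-contained example).
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)

# Multi-label head: AU occurrence is not mutually exclusive, so use
# per-AU sigmoid outputs with binary cross-entropy, not softmax.
head = nn.Linear(32, NUM_AUS)
model = nn.Sequential(backbone, head)

# One common fine-tuning variant: freeze the backbone, train the head.
for p in backbone.parameters():
    p.requires_grad = False

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

# Dummy "infant" batch: 4 RGB face crops with binary AU labels.
x = torch.randn(4, 3, 64, 64)
y = torch.randint(0, 2, (4, NUM_AUS)).float()

losses = []
for _ in range(20):  # a few fine-tuning steps on the fixed batch
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```

The design choice worth noting is the loss: because the study targets AUs central to positive and negative emotion that can appear together, each AU is an independent binary prediction rather than one class in a categorical output.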
Pages: 1024-1035
Page count: 12