Observer-generated maps of diagnostic facial features enable categorization and prediction of emotion expressions

Times Cited: 0
Authors
Wegrzyn, Martin [1 ]
Muenst, Laura [1 ]
Konig, Jessica [1 ]
Dinter, Maximilian [1 ]
Kissler, Johanna [1 ,2 ]
Affiliations
[1] Bielefeld Univ, Dept Psychol, Bielefeld, Germany
[2] Ctr Cognit Interact Technol, Bielefeld, Germany
Keywords
Facial expressions; Categorization; Emotion recognition; Face perception; Prediction; RECOGNITION; INFORMATION; FACE; EYES; FEAR; REPRESENTATIONS; REVEAL;
DOI
10.1016/j.actpsy.2024.104569
Chinese Library Classification (CLC)
B84 [Psychology];
Discipline Classification Code
04 ; 0402 ;
Abstract
According to one prominent model, facial expressions of emotion can be categorized as depicting happiness, disgust, anger, sadness, fear, or surprise. An open question is which facial features observers use to recognize the different expressions, and whether the features indicated by observers can be used to predict which expression they saw. We created fine-grained maps of diagnostic facial features by asking participants to use mouse clicks to highlight the parts of a face they deemed useful for recognizing its expression. We tested how well the resulting maps align with models of emotion expressions (based on Action Units) and how the maps relate to the accuracy with which observers recognize fully visible or partly masked faces. As expected, observers focused on the eye and mouth regions in all faces. However, each expression deviated from this global pattern in a unique way, allowing us to create maps of diagnostic face regions. The Action Units considered most important for expressing an emotion were highlighted most often, indicating their psychological validity. The maps of facial features also allowed us to correctly predict which expression a participant had seen, with above-chance accuracies for all expressions. For happiness, fear, and anger, the face half that was highlighted the most was also the half whose visibility led to higher recognition accuracy. The results suggest that diagnostic facial features are distributed in unique patterns for each expression, which observers seem to intuitively extract and use when categorizing facial displays of emotion.
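The prediction result described above (above-chance classification of the seen expression from observer-generated feature maps) can be illustrated with a minimal sketch. The code below is an assumption for illustration only, not the authors' analysis pipeline: it treats each click map as a flattened feature vector and cross-validates a standard multiclass classifier. The array sizes, the use of scikit-learn's LogisticRegression, and the randomly generated placeholder maps are all hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

EXPRESSIONS = ["happiness", "disgust", "anger", "sadness", "fear", "surprise"]
rng = np.random.default_rng(0)

# Hypothetical data: 60 observers x 6 expressions, each trial yielding a
# 48 x 48 map of per-pixel click counts (placeholder noise here, so the
# resulting accuracy will hover around chance; real click maps would be
# expected to yield the above-chance accuracies reported in the abstract).
n_observers, map_shape = 60, (48, 48)
X_maps = rng.poisson(1.0, size=(n_observers * len(EXPRESSIONS), *map_shape))
y = np.tile(np.arange(len(EXPRESSIONS)), n_observers)

# Flatten each map into a feature vector and standardize across trials.
X = X_maps.reshape(len(X_maps), -1).astype(float)
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)

# Cross-validated multiclass classification of the seen expression.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f} "
      f"(chance = {1 / len(EXPRESSIONS):.2f})")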
Pages: 10