Using Agreement on Direction of Change to Build Rank-Based Emotion Classifiers

Cited by: 25
Authors
Parthasarathy, Srinivas [1 ]
Cowie, Roddy [2 ]
Busso, Carlos [1 ]
Affiliations
[1] Univ Texas Dallas, Erik Jonsson Sch Engn & Comp Sci, Richardson, TX 75080 USA
[2] Queens Univ, Sch Psychol, Belfast BT7 1NN, Antrim, North Ireland
Funding
U.S. National Science Foundation;
Keywords
Emotion recognition; rank-based emotion recognition; relative emotional labels; time-continuous emotional descriptors; RECOGNITION; REGRESSION;
DOI
10.1109/TASLP.2016.2593944
Chinese Library Classification (CLC)
O42 [Acoustics];
Subject Classification Codes
070206; 082403;
Abstract
Automatic emotion recognition in realistic domains is a challenging task given the subtle expressive behaviors that occur during human interactions. The challenges start with noisy emotional descriptors provided by multiple evaluators, which are characterized by low inter-evaluator agreement. Studies have suggested that evaluators are more consistent in detecting qualitative relations between episodes (i.e., emotional contrasts) than in assigning absolute scores (i.e., the actual emotion). Based on these observations, this study explores the use of relative labels to train machine learning algorithms that can rank expressive behaviors. Instead of deriving relative labels from expensive and time-consuming subjective evaluations, the labels are extracted from existing time-continuous evaluations of expressive attributes annotated with FEELTRACE. We rely on qualitative agreement (QA) analysis to estimate relative labels, which are used to train rank-based classifiers (rankers). The experimental evaluation on the SEMAINE database demonstrates the benefits of the proposed approach. The ranking performance using the QA-based labels compares favorably against preference learning rankers trained with relative labels obtained by simply aggregating the absolute values of the emotional traces across evaluators, which is the common approach used in other studies.
Pages: 2108-2121
Number of pages: 14
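As a rough illustration of the idea summarized in the abstract, the Python sketch below derives a relative (preference) label for a pair of segments from several evaluators' attribute values, keeping the label only when the evaluators agree on the direction of change. The function name, the per-evaluator summary values, and the margin and min_agreement thresholds are illustrative assumptions, not the paper's exact QA procedure.

# Minimal sketch (not the authors' exact QA analysis): emit a relative label
# for a pair of segments only when evaluators agree on the direction of change.
def qa_relative_label(values_a, values_b, margin=0.05, min_agreement=0.7):
    """values_a, values_b: one summary attribute value (e.g., mean arousal)
    per evaluator for segments A and B, aligned by evaluator index.
    Returns +1 if A is ranked above B, -1 if below, None if ambiguous."""
    directions = []
    for a, b in zip(values_a, values_b):
        if a - b > margin:
            directions.append(1)    # this evaluator rates A above B
        elif b - a > margin:
            directions.append(-1)   # this evaluator rates B above A
        else:
            directions.append(0)    # no clear change for this evaluator
    for sign in (1, -1):
        if directions.count(sign) / len(directions) >= min_agreement:
            return sign             # consistent direction across evaluators
    return None                     # disagreement or flat traces: discard pair

# Example with three evaluators: clear agreement vs. an ambiguous pair.
print(qa_relative_label([0.6, 0.7, 0.5], [0.2, 0.3, 0.4]))   # 1
print(qa_relative_label([0.6, 0.2, 0.5], [0.5, 0.6, 0.4]))   # None

Pairs that receive a label would then feed a preference learning ranker; discarded pairs reflect the low inter-evaluator agreement noted in the abstract.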