Content-Based Video Emotion Tagging Augmented by Users' Multiple Physiological Responses

Cited by: 13
Authors
Wang, Shangfei [1 ]
Chen, Shiyu [1 ]
Ji, Qiang [2 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230027, Anhui, Peoples R China
[2] Rensselaer Polytech Inst, Dept Elect Comp & Syst Engn, Troy, NY 12180 USA
Funding
U.S. National Science Foundation;
Keywords
Video emotion tagging; EEG signals; peripheral physiological signals; support vector machine; RECOGNITION;
DOI
10.1109/TAFFC.2017.2702749
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
The intrinsic interactions among a video's emotion tag, its content, and a user's spontaneous responses while consuming the video can be leveraged to improve video emotion tagging, but such interactions have not yet been thoroughly exploited. In this paper, we propose a novel content-based video emotion tagging approach augmented by users' multiple physiological responses, which are required only during training. Specifically, a better emotion tagging model is constructed by introducing similarity constraints between the classifiers built from video content and from the multiple physiological signals available during training. Maximum margin classifiers are adopted, and efficient learning algorithms for the proposed model are developed. Furthermore, the proposed video emotion tagging approach is extended to handle incomplete physiological signals, since these signals are often corrupted by artifacts. Experiments on four benchmark databases demonstrate that the proposed method effectively and implicitly integrates multiple physiological responses, and that it outperforms existing methods with both complete and incomplete physiological signals.
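To make the training scheme concrete, the following is a minimal sketch (not the authors' exact formulation) of a linear max-margin video-tagging model coupled to a physiological-signal classifier through a similarity penalty on their decision values, so that physiological features are used only during training while tagging at test time relies on video content alone. The function names, the squared-difference relaxation of the similarity constraint, the hyperparameters `lam` and `gamma`, and the plain subgradient descent are all illustrative assumptions.

```python
# Sketch of max-margin video emotion tagging with physiological signals as
# training-only side information. Labels y are assumed to be in {-1, +1}.
import numpy as np

def train_coupled_svm(X_v, X_p, y, lam=1e-2, gamma=1e-1, lr=1e-3, epochs=200):
    n, d_v = X_v.shape
    d_p = X_p.shape[1]
    w_v = np.zeros(d_v)          # video-content classifier (kept for tagging)
    w_p = np.zeros(d_p)          # physiological-response classifier (training only)
    for _ in range(epochs):
        f_v = X_v @ w_v          # decision values from video content
        f_p = X_p @ w_p          # decision values from physiological signals
        # Subgradients of the two hinge losses with L2 regularization.
        m_v = (y * f_v < 1).astype(float)
        m_p = (y * f_p < 1).astype(float)
        g_v = lam * w_v - (X_v * (m_v * y)[:, None]).mean(axis=0)
        g_p = lam * w_p - (X_p * (m_p * y)[:, None]).mean(axis=0)
        # Similarity constraint relaxed to a squared penalty ||f_v - f_p||^2,
        # pulling the two classifiers toward consistent decisions on training data.
        diff = f_v - f_p
        g_v += gamma * (X_v.T @ diff) / n
        g_p -= gamma * (X_p.T @ diff) / n
        w_v -= lr * g_v
        w_p -= lr * g_p
    return w_v                   # only the video-content model is needed at test time

def tag_video(w_v, x_video):
    # Test-time tagging uses video features only; no physiological signals required.
    return 1 if x_video @ w_v >= 0 else -1
```

In this reading, the coupling term plays the role of the similarity constraint described in the abstract: the physiological branch shapes the video-content classifier during training and is then discarded, so deployment needs no sensors.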
Pages: 155-166
Page count: 12