Cross-Subject Emotion Recognition Using Deep Adaptation Networks

Cited by: 120
Authors
Li, He [1]
Jin, Yi-Ming [1]
Zheng, Wei-Long [1]
Lu, Bao-Liang [1,2,3]
Affiliations
[1] Shanghai Jiao Tong Univ, Ctr Brain Comp & Machine Intelligence, Dept Comp Sci & Engn, 800 Dong Chuan Rd, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Key Lab Shanghai Educ Commiss Intelligent Interac, 800 Dong Chuan Rd, Shanghai 200240, Peoples R China
[3] Shanghai Jiao Tong Univ, Brain Sci & Technol Res Ctr, 800 Dong Chuan Rd, Shanghai 200240, Peoples R China
Source
NEURAL INFORMATION PROCESSING (ICONIP 2018), PT V | 2018 / Volume 11305
Funding
National Natural Science Foundation of China;
Keywords
Affective brain-computer interface; Emotion recognition; EEG; Deep neural network; Domain adaptation;
DOI
10.1007/978-3-030-04221-9_36
Chinese Library Classification (CLC) code
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Affective models based on EEG signals have been proposed in recent years. However, most of these models require subject-specific training and generalize poorly when applied to new subjects, mainly because of individual differences across subjects. On the other hand, collecting subject-specific training data for every new user is time-consuming and costly. Eliminating individual differences in EEG signals is therefore one of the key challenges in building practical affective models. In this paper, we apply a Deep Adaptation Network (DAN) to address this problem. The performance is evaluated on two publicly available EEG emotion recognition datasets, SEED and SEED-IV, in comparison with two baseline methods without domain adaptation and several other domain adaptation methods. The experimental results indicate that DAN significantly outperforms the existing methods.
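To make the approach concrete, below is a minimal, illustrative sketch of DAN-style training for cross-subject EEG emotion recognition, assuming PyTorch. The network layout, the 310-dimensional input (differential-entropy-style features as used with SEED), and the single Gaussian-kernel MMD are simplifying assumptions made here for brevity; the DAN formulation of Long et al. that the paper builds on uses a multi-kernel MMD on the task-specific layers, and none of the names or hyperparameters below are taken from the paper itself.

```python
# Illustrative DAN-style sketch (not the authors' exact implementation):
# labeled source subjects + unlabeled target (new) subject, with an MMD
# penalty that aligns the two feature distributions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def gaussian_mmd(x, y, sigma=1.0):
    """Biased squared-MMD estimate between two feature batches, single Gaussian kernel."""
    def kernel(a, b):
        dists = torch.cdist(a, b) ** 2
        return torch.exp(-dists / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()


class EmotionNet(nn.Module):
    """Feature extractor + classifier; the MMD penalty is applied to the feature layer."""
    def __init__(self, in_dim=310, hidden=128, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, x):
        f = self.features(x)
        return f, self.classifier(f)


def train_step(model, opt, xs, ys, xt, mmd_weight=1.0):
    """One update: cross-entropy on labeled source data plus an MMD term that
    pulls unlabeled target (new-subject) features toward the source distribution."""
    fs, logits = model(xs)
    ft, _ = model(xt)  # target data contributes without labels
    loss = F.cross_entropy(logits, ys) + mmd_weight * gaussian_mmd(fs, ft)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()


if __name__ == "__main__":
    # Toy run with random stand-in data in place of real EEG features.
    model = EmotionNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    xs, ys = torch.randn(32, 310), torch.randint(0, 3, (32,))
    xt = torch.randn(32, 310)
    print(train_step(model, opt, xs, ys, xt))
```

The key point of the sketch is that the new subject's data enters only through the MMD term, which pushes the feature extractor toward subject-invariant representations without requiring any labels from that subject.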
Pages: 403-413
Page count: 11