Building Cross-Subject EEG-Based Affective Models Using Heterogeneous Transfer Learning

Cited by: 0
Authors
Zheng W.-L. [1 ,2 ,3 ,4 ]
Shi Z.-F. [1 ]
Lv B.-L. [1 ,2 ,3 ]
Affiliations
[1] Center for Brain-like Computing and Machine Intelligence, Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai
[2] Key Laboratory of Shanghai Education Commission for Intelligent Interaction and Cognitive Engineering, Shanghai Jiao Tong University, Shanghai
[3] Brain Science and Technology Research Center, Shanghai Jiao Tong University, Shanghai
[4] Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston
Source
Jisuanji Xuebao/Chinese Journal of Computers | 2020 / Vol. 43 / No. 02
Funding
National Natural Science Foundation of China;
Keywords
Affective brain-computer interface; Cross-subject affective models; EEG signal; Eye movement signal; Multimodal emotion recognition; Scanpath; Transfer learning;
DOI
10.11897/SP.J.1016.2020.00177
Abstract
Developing robust cross-subject EEG-based affective models is one of the key problems in affective brain-computer interfaces, because individual differences and the non-stationarity of EEG make it challenging to build satisfactory affective models. In recent studies, transfer learning has been successfully applied to reducing the differences in feature distributions between source and target subjects. However, it still requires users to acquire a moderately large amount of unlabeled EEG data from target subjects. Unlike existing transfer learning frameworks based solely on EEG, we propose an alternative approach that applies eye tracking data for calibration. In this paper, we propose a novel approach to leveraging heterogeneous knowledge from spatiotemporal scanpath patterns to enhance the performance of cross-subject EEG-based affective models. The main idea behind our approach is that what and where subjects are watching elicits specific neural activities in their brains, and such information provides important clues for emotion recognition. We introduce heterogeneous transfer learning into a modified transductive parameter transfer (TPT) framework. The TPT approach consists of three main steps. First, an individual classifier is learned on the training dataset of each source subject. Second, a regression function is trained to learn the relation between data distributions and classifier parameter vectors. Finally, the target classifier is obtained from the target feature distribution and the distribution-to-classifier mapping. To quantify the domain discrepancy across subjects, the scanpath sequences recorded under the same film clips from different subjects are encoded and compared with dynamic time warping, and the resulting distances serve as a measurement of domain discrepancy for subject transfer. The scanpath patterns are used as guidance for what knowledge should be transferred and how to transfer relevant affective information.
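The three TPT steps described above can be sketched as follows. This is a minimal illustration on synthetic data: the choice of logistic regression for the per-subject classifiers, ridge regression for the distribution-to-parameter mapping, and the simple mean/std distribution descriptor are all assumptions for the sketch, not the paper's actual kernel-based implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, Ridge

rng = np.random.default_rng(0)

def train_source_classifiers(source_datasets):
    """Step 1: fit one linear classifier per source subject and
    collect its parameter vector (weights + bias)."""
    params = []
    for X, y in source_datasets:
        clf = LogisticRegression().fit(X, y)
        params.append(np.concatenate([clf.coef_.ravel(), clf.intercept_]))
    return np.array(params)

def distribution_descriptor(X):
    """Toy summary of a subject's feature distribution (mean and std
    per feature); the paper instead compares distributions via kernels."""
    return np.concatenate([X.mean(axis=0), X.std(axis=0)])

# Synthetic example: 5 source subjects, 4-D features, binary labels,
# each subject's distribution shifted to mimic individual differences.
sources = []
for s in range(5):
    shift = rng.normal(scale=0.5, size=4)
    X = rng.normal(size=(60, 4)) + shift
    y = (X[:, 0] + X[:, 1] > shift[0] + shift[1]).astype(int)
    sources.append((X, y))

theta = train_source_classifiers(sources)                    # step 1
D = np.array([distribution_descriptor(X) for X, _ in sources])
mapper = Ridge(alpha=1.0).fit(D, theta)                      # step 2

# Step 3: map the (unlabeled) target distribution to classifier parameters.
X_target = rng.normal(size=(40, 4))
theta_t = mapper.predict(distribution_descriptor(X_target)[None, :])[0]
w, b = theta_t[:-1], theta_t[-1]
preds = (X_target @ w + b > 0).astype(int)
```

The key point of the scheme is step 3: the target subject contributes only a distribution estimate, never labels, so calibration needs no labeled target recordings.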
We calculate scanpath-based kernels and EEG-based kernels and construct the cross-subject affective models with the transductive parameter transfer algorithm. The proposed approach has two advantages over existing approaches: 1) easily accessed eye tracking data from target subjects is utilized for subject transfer; 2) with only eye tracking data from target subjects, the proposed heterogeneous knowledge transfer approach can still exploit the discriminative properties of EEG from other subjects. In the conventional framework, the calibration phase requires recording a moderately large amount of labeled EEG data. In our framework, the calibration phase for new subjects can instead record only eye tracking data and transfer discriminative EEG information from potential source subjects. This is feasible in scenarios where collecting eye tracking data is much easier, while the adaptive models are still able to reuse the representational capacity of previously recorded EEG for emotion recognition, taking advantage of the high performance of EEG. The performance of our proposed approach and state-of-the-art subject transfer methods is evaluated on an EEG and eye tracking dataset of three affective states (positive, neutral, and negative). The experimental results demonstrate that the scanpath-based transfer models achieve performance comparable to the EEG-based models, and the scanpath-based subject transfer models obtain a mean accuracy of 69.72%, compared with an average of 50.46% for conventional generic classifiers. These results demonstrate the effectiveness of our proposed approach. © 2020, Science Press. All rights reserved.
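The scanpath-based kernel mentioned above can be illustrated by comparing fixation sequences with dynamic time warping and converting the pairwise distances into similarities. The Euclidean local cost, the exponential kernel form, and the `gamma` bandwidth are illustrative assumptions; the paper's actual scanpath encoding may differ.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two scanpath sequences,
    each an array of (x, y) fixation coordinates."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def scanpath_kernel(paths, gamma=0.01):
    """Turn pairwise DTW distances into a similarity (kernel) matrix
    that can quantify domain discrepancy between subjects."""
    n = len(paths)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.exp(-gamma * dtw_distance(paths[i], paths[j]))
    return K

# Three synthetic "subjects" watching the same clip: p2 follows the same
# trajectory as p1 at half the sampling rate; p3 is a different pattern.
t = np.linspace(0, 1, 20)
p1 = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
p2 = p1[::2]
p3 = np.stack([t, t], axis=1)
K = scanpath_kernel([p1, p2, p3])
```

Because DTW aligns sequences of unequal length and speed, two subjects who scan the same clip in a similar spatial order score as close, even when their viewing tempos differ.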
Pages: 177-189
Page count: 12