Temporal aware Mixed Attention-based Convolution and Transformer Network for cross-subject EEG emotion recognition

Cited by: 4
Authors
Si, Xiaopeng [1 ]
Huang, Dong [1 ]
Liang, Zhen [2 ]
Sun, Yulin [1 ]
Huang, He [1 ]
Liu, Qile [2 ]
Yang, Zhuobin [1 ]
Ming, Dong [1 ]
Affiliations
[1] Academy of Medical Engineering and Translational Medicine, State Key Laboratory of Advanced Medical Materials and Devices, Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin Key Laboratory of Brain Science and Neural Engineering
[2] School of Biomedical Engineering, Medical School, Shenzhen University, Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen
Funding
National Natural Science Foundation of China;
Keywords
Attention; Cross-subject; Electroencephalography; Emotion recognition; Transformer;
DOI
10.1016/j.compbiomed.2024.108973
Abstract
Emotion recognition is crucial for human–computer interaction, and electroencephalography (EEG) is a valuable tool for capturing and reflecting human emotions. In this study, we propose a hierarchical hybrid model called the Mixed Attention-based Convolution and Transformer Network (MACTN). The model is designed to capture both local and global temporal information, and is inspired by neuroscientific findings on the temporal dynamics of emotion. First, we introduce depth-wise temporal convolution and separable convolution to extract local temporal features. Then, a self-attention-based transformer integrates the sparse global emotional features. In addition, a channel attention mechanism is designed to identify the most task-relevant channels, facilitating the capture of relationships between channels and emotional states. Extensive experiments are conducted on three public datasets under both offline and online evaluation modes. In multi-class cross-subject online evaluation on the THU-EP dataset, MACTN improves 9-class emotion recognition accuracy by approximately 8% over state-of-the-art methods. In multi-class cross-subject offline evaluation on the DEAP and SEED datasets, comparable performance is achieved using only the raw EEG signals, without prior knowledge or transfer learning during feature extraction and learning. Furthermore, ablation studies show that integrating the self-attention and channel-attention mechanisms improves classification performance. This method won the final championship of the Emotional BCI Competition at the World Robot Contest. The source code is available at https://github.com/ThreePoundUniverse/MACTN. © 2024 Elsevier Ltd
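The three building blocks named in the abstract — depth-wise temporal convolution, channel attention, and transformer-style self-attention — can be sketched as follows. This is a minimal NumPy illustration of the general mechanisms only, not the authors' MACTN implementation (see the linked repository for that); all function names, tensor shapes, and the random toy signal are assumptions made for illustration.

```python
import numpy as np

def depthwise_temporal_conv(x, kernels):
    """Convolve each EEG channel with its own temporal kernel.

    x: (channels, time), kernels: (channels, k) -> (channels, time - k + 1)
    """
    c, k = kernels.shape
    out = np.empty((c, x.shape[1] - k + 1))
    for i in range(c):
        # Reversing the kernel turns np.convolve into the cross-correlation
        # that deep-learning "convolution" layers actually compute.
        out[i] = np.convolve(x[i], kernels[i][::-1], mode="valid")
    return out

def channel_attention(x):
    """Reweight channels by a softmax over their mean activation."""
    scores = x.mean(axis=1)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return x * w[:, None], w

def self_attention(tokens):
    """Single-head scaled dot-product self-attention, identity projections."""
    d = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # rows sum to 1
    return attn @ tokens

# Toy pipeline: a 32-channel EEG segment of 128 samples.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 128))
feats = depthwise_temporal_conv(eeg, rng.standard_normal((32, 7)))
feats, weights = channel_attention(feats)        # (32, 122)
tokens = feats.T[::8]                            # subsample into 16 tokens
out = self_attention(tokens)                     # (16, 32)
```

In the actual model, the transformer would operate on learned convolutional feature tokens and all weights would be trained end to end; the sketch only shows how local convolution, channel reweighting, and global attention compose.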