Exploring Deep Learning Features for Automatic Classification of Human Emotion Using EEG Rhythms

Cited: 66
Authors
Demir, Fatih [1 ]
Sobahi, Nebras [2 ]
Siuly, Siuly [3 ]
Sengur, Abdulkadir [1 ]
Affiliations
[1] Firat Univ, Fac Technol, Dept Elect & Elect Engn, TR-23119 Elazig, Turkey
[2] King Abdulaziz Univ, Dept Elect & Comp Engn, Jeddah 21589, Saudi Arabia
[3] Victoria Univ, Inst Sustainable Ind & Liveable Cities, Footscray, Vic 3011, Australia
Keywords
Electroencephalography; Feature extraction; Support vector machines; Erbium; Brain modeling; Continuous wavelet transforms; Filtering; EEG based emotion classification; EEG rhythms; CWT; deep features; pretrained CNN models; RECOGNITION;
DOI
10.1109/JSEN.2021.3070373
Chinese Library Classification (CLC)
TM (Electrical Engineering); TN (Electronics & Communication Technology);
Discipline codes
0808 ; 0809 ;
Abstract
Emotion recognition (ER) from electroencephalogram (EEG) signals is a challenging task due to the non-linear and non-stationary nature of EEG signals. Existing feature extraction methods cannot extract the deep, concealed characteristics of EEG signals from different layers for an efficient classification scheme, and it is also hard to select appropriate and effective feature extraction methods for different types of EEG data. Hence, this study aims to develop an efficient deep-feature-extraction-based method to automatically classify people's emotional states. To discover reliable deep features, five deep convolutional neural network (CNN) models are considered: AlexNet, VGG16, ResNet50, SqueezeNet, and MobileNetV2. Pre-processing, the Wavelet Transform (WT), and the Continuous Wavelet Transform (CWT) are employed to convert the EEG signals into EEG rhythm images; the five well-known pretrained CNN models are then employed for feature extraction. Finally, the proposed method feeds the obtained features to a support vector machine (SVM) for classification into binary valence and arousal classes. The DEAP dataset was used in the experimental work. The results demonstrate that for valence discrimination, the AlexNet features with the Alpha rhythm produce the best accuracy score (91.07% on channel Oz), while for arousal discrimination, the MobileNetV2 features with the Delta rhythm yield the highest accuracy score (98.93% on channel C3).
Pages: 14923–14930 (8 pages)
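The front end of the pipeline described in the abstract (rhythm extraction followed by a CWT scalogram image) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it uses a synthetic 128 Hz signal (DEAP EEG is downsampled to 128 Hz), a crude FFT-based band filter for the Alpha rhythm, and a complex Morlet CWT written directly in NumPy; the paper itself feeds the resulting scalogram images to pretrained CNNs and then an SVM.

```python
import numpy as np

FS = 128  # assumed sampling rate (DEAP EEG is downsampled to 128 Hz)

def bandpass_fft(x, lo, hi, fs=FS):
    """Crude rhythm extraction: zero out FFT bins outside [lo, hi] Hz."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

def morlet_scalogram(x, freqs, fs=FS, w=6.0):
    """CWT magnitude via convolution with complex Morlet wavelets
    (one row per analysis frequency; rows form the scalogram image)."""
    rows = []
    for f0 in freqs:
        # wavelet support spans roughly 2*w cycles at frequency f0
        t = np.arange(-w / f0, w / f0, 1.0 / fs)
        wavelet = np.exp(2j * np.pi * f0 * t) * np.exp(-(t * f0 / w) ** 2)
        rows.append(np.abs(np.convolve(x, wavelet, mode="same")))
    return np.array(rows)  # shape: (len(freqs), len(x))

# Synthetic single-channel "EEG": 10 Hz alpha burst plus noise
rng = np.random.default_rng(0)
n = FS * 4
sig = np.sin(2 * np.pi * 10 * np.arange(n) / FS) + 0.5 * rng.standard_normal(n)

alpha = bandpass_fft(sig, 8, 13)                    # Alpha rhythm (8-13 Hz)
scalo = morlet_scalogram(alpha, np.arange(4, 32))   # image fed to the CNN
print(scalo.shape)  # → (28, 512)
```

In the paper's setup the scalogram would be rendered as an RGB image, resized to the pretrained CNN's input size, and the resulting deep features classified with an SVM.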