Emotion Classification in Response to Tactile Enhanced Multimedia using Frequency Domain Features of Brain Signals

Times Cited: 0
Authors
Raheel, Aasim [1 ]
Majid, Muhammad [1 ]
Anwar, Syed Muhammad [2 ]
Bagci, Ulas [2 ]
Affiliations
[1] Univ Engn & Technol, Dept Comp Engn, Signal Image Multimedia Proc & LEarning SIMPLE Re, Taxila 47050, Pakistan
[2] Univ Cent Florida, Ctr Res Comp Vis CRCV, Orlando, FL 32816 USA
Source
2019 41ST ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC) | 2019
Keywords
RECOGNITION;
DOI
10.1109/embc.2019.8857632
CLC Number
R318 [Biomedical Engineering];
Discipline Classification Code
0831;
Abstract
Tactile enhanced multimedia is generated by synchronizing traditional multimedia clips with an electric heater and a fan to produce hot and cold air effects. The objective is to give viewers a more realistic and immersive feel of the multimedia content. The response to this enhanced multimedia content (mulsemedia) is evaluated in terms of appreciation/emotion by using human brain signals. We observe and record electroencephalography (EEG) data using a commercially available four-channel MUSE headband. A total of 21 participants voluntarily took part in this study for EEG recordings. We extract frequency domain features from five different bands of each EEG channel. Four emotions, namely happy, relaxed, sad, and angry, are classified using a support vector machine in response to the tactile enhanced multimedia. An increased accuracy of 76.19% is achieved, compared to 63.41% obtained with time domain features. Our results show that the selected frequency domain features could be better suited for emotion classification in mulsemedia studies.
Pages: 1201-1204
Number of Pages: 4
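The abstract describes a pipeline in which frequency domain (band-power) features are extracted from each EEG channel and fed to a support vector machine for four-class emotion classification. The Python sketch below illustrates that general approach on synthetic data only; the sampling rate, Welch window length, band edges, channel count, SVM kernel, and all data shown are illustrative assumptions and not the authors' implementation or parameters.

```python
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

FS = 256                      # assumed sampling rate (Hz)
BANDS = {                     # standard EEG frequency bands (Hz); edges are assumptions
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}

def band_power_features(epoch, fs=FS):
    """Compute per-channel band powers for one (channels x samples) epoch."""
    feats = []
    for ch in epoch:
        freqs, psd = welch(ch, fs=fs, nperseg=fs * 2)     # Welch power spectral density
        for lo, hi in BANDS.values():
            idx = (freqs >= lo) & (freqs < hi)
            feats.append(np.trapz(psd[idx], freqs[idx]))   # integrate PSD over the band
    return np.array(feats)    # 4 channels x 5 bands = 20 features

# Synthetic stand-in for labeled EEG epochs: 84 epochs, 4 channels, 8 s each (illustrative only).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((84, 4, FS * 8))
labels = rng.integers(0, 4, size=84)   # 0=happy, 1=relaxed, 2=sad, 3=angry

X = np.array([band_power_features(e) for e in epochs])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, stratify=labels, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))  # kernel/C are assumptions
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

With real recordings, the synthetic `epochs` and `labels` arrays would be replaced by per-stimulus EEG segments and their emotion annotations; the reported 76.19% accuracy in the abstract refers to the authors' experiment, not to this sketch.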