Modeling the Affective Space of 360 Virtual Reality Videos Based on Arousal and Valence for Wearable EEG-Based VR Emotion Classification

Cited by: 0
Authors
Suhaimi, Nazmi Sofian [1 ]
Yuan, Chrystalle Tan Bih [2 ]
Teo, Jason [3 ]
Mountstephens, James [3 ]
Affiliations
[1] Univ Malaysia Sabah, Fac Comp & Informat, Evolutionary Comp Lab, Kota Kinabalu, Malaysia
[2] Univ Malaysia Sabah, Fac Med & Hlth Sci, Kota Kinabalu, Malaysia
[3] Univ Malaysia Sabah, Fac Comp & Informat, Kota Kinabalu, Malaysia
Source
2018 IEEE 14TH INTERNATIONAL COLLOQUIUM ON SIGNAL PROCESSING & ITS APPLICATIONS (CSPA 2018) | 2018
Keywords
Emotion Classification; Wearable EEG; Machine Learning; Electroencephalography; Virtual Reality; Emotion Detection Dataset;
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic & Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
This study produces a novel database for emotional analysis that uses virtual reality (VR) content, obtained from third-party sources such as YouTube, Discovery VR, Jaunt VR, NYT VR, Veer VR, and Google Cardboard, as the visual stimuli for emotion classification using commercial off-the-shelf (COTS) wearable electroencephalography (EEG) headsets. Existing resources for emotional analysis, such as the Dataset for Emotion Analysis using EEG, Physiological and video signals (DEAP) presented by Koelstra et al. and the Database for Emotional Analysis in Music (DEAM) by Soleymani et al., focus on music and music-video stimuli. The database presented here instead consists of novel affective tags for virtual reality content, specifically YouTube 360 videos, rated by 15 participants according to the Arousal-Valence Space (AVS) emotion model. The ratings obtained from these evaluations will serve as the underlying dataset for the next stage, a machine learning implementation that targets emotion classification of virtual reality stimuli using wearable EEG headsets.
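As a concrete illustration of how AVS ratings can feed the classification stage described above, the Python sketch below maps each clip's valence-arousal rating to one of the four AVS quadrants and trains a simple classifier. This is not the paper's pipeline: the quadrant labels, the 1-9 rating scale midpoint, and the simulated EEG band-power features are all illustrative assumptions.

    # Hypothetical sketch: quadrant labels from Arousal-Valence ratings,
    # then a simple classifier on placeholder EEG features. All data here
    # is simulated, so accuracy will be near chance; it only shows the shape
    # of the workflow, not the authors' method.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    def quadrant_label(valence, arousal, midpoint=5.0):
        """Map a 1-9 valence/arousal rating pair to an AVS quadrant label."""
        if valence >= midpoint:
            return "HVHA" if arousal >= midpoint else "HVLA"
        return "LVHA" if arousal >= midpoint else "LVLA"

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 8))               # simulated band-power features
    ratings = rng.uniform(1, 9, size=(300, 2))  # simulated (valence, arousal) tags
    y = np.array([quadrant_label(v, a) for v, a in ratings])

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("quadrant accuracy:", accuracy_score(y_test, clf.predict(X_test)))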
Pages: 167-172
Page count: 6
References
10 in total
[1] [Anonymous]. Conference on Human System Interaction, 2013. DOI: 10.1109/HSI.2013.6577880.
[2] Cellan-Jones, R. BBC News, 2016.
[3] Chanthaphan, Nattawat; Uchimura, Keiichi; Satonaka, Takami; Makioka, Tsuyoshi. 2015 11th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS), 2015: 117-124.
[4] Chenchah, F. INT J ADV COMPUT SC, 2015, 6: 135.
[5] Koelstra, Sander; Muhl, Christian; Soleymani, Mohammad; Lee, Jong-Seok; Yazdani, Ashkan; Ebrahimi, Touradj; Pun, Thierry; Nijholt, Anton; Patras, Ioannis. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Transactions on Affective Computing, 2012, 3(1): 18-31.
[6] Koolagudi, Shashidhar G.; Rao, K. Sreenivasa. Emotion recognition from speech: a review. International Journal of Speech Technology, 2012, 15(2): 99-117.
[7] Picard, R. W.; Vyzas, E.; Healey, J. Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(10): 1175-1191.
[8] Steuer, J. Defining virtual reality: Dimensions determining telepresence. Journal of Communication, 1992, 42(4): 73-93.
[9] Suchitra. 2016 3rd International Conference on Signal Processing and Integrated Networks (SPIN), 2016: 666. DOI: 10.1109/SPIN.2016.7566780.
[10] Turan, Cigdem; Lam, Kin-Man; He, Xiangjian. Facial Expression Recognition with Emotion-Based Feature Fusion. 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2015.