Four-class emotion classification in virtual reality using pupillometry

Times Cited: 26
Authors
Zheng, Lim Jia [1 ]
Mountstephens, James [1 ]
Teo, Jason [1 ]
Affiliations
[1] Univ Malaysia Sabah, Fac Comp & Informat, Jalan UMS, Kota Kinabalu 88400, Sabah, Malaysia
Keywords
Emotion classification; Eye-tracking; Pupil diameter; Virtual reality; Machine learning;
DOI
10.1186/s40537-020-00322-9
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Code
081202 ;
Abstract
Background: Emotion classification remains a challenging problem in affective computing. The large majority of emotion classification studies rely on electroencephalography (EEG) and/or electrocardiography (ECG) signals and classify emotions into only two or three classes. Moreover, most studies use music or visual stimuli presented through conventional displays such as computer or television screens. This study reports a novel approach that uses pupillometry alone, in the form of pupil diameter data, to classify emotions into four distinct classes according to Russell's Circumplex Model of Emotions, with emotional stimuli presented in a virtual reality (VR) environment. The stimuli are 360-degree videos presented through a VR headset, and pupil diameter, acquired with an eye-tracker, serves as the sole classification feature. Three classifiers were compared: Support Vector Machine (SVM), k-Nearest Neighbor (KNN), and Random Forest (RF).
Findings: SVM achieved the best performance on the four-class intra-subject classification task, with an average accuracy of 57.05%, more than twice that of a random classifier. Although the accuracy can still be significantly improved, this is the first systematic study to use eye-tracking data alone, without any supplementary sensor modalities, for human emotion classification, and it demonstrates that even with the single feature of pupil diameter, emotions can be classified into four distinct classes with a measurable level of accuracy. Moreover, the best per-class performance was 70.83%, achieved by the KNN classifier on Quadrant 3 emotions.
Conclusion: This study presents the first systematic investigation of pupillometry as the sole feature for classifying emotions into four distinct classes using VR stimuli. The ability to classify emotions from pupil data alone represents a promising new direction for affective computing, as applications could be built with the readily available cameras on laptops and other mobile devices, without specialized and costly equipment such as EEG and/or ECG sensors.
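The classifier comparison described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the pupil-diameter features are synthetic stand-ins, and the window statistics, class-dependent baselines, and hyperparameters are assumptions for demonstration only.

```python
# Hypothetical sketch: four-class emotion classification from pupil-diameter
# features, mirroring the paper's classifier comparison (SVM, KNN, RF).
# All data here is synthetic; feature layout and baselines are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial pupil-diameter statistics
# (e.g. mean, std, min, max over a stimulus window), one row per trial.
n_trials_per_class = 60
X, y = [], []
for quadrant in range(4):  # Russell's Circumplex quadrants Q1-Q4
    base = 3.0 + 0.4 * quadrant  # assumed class-dependent baseline (mm)
    X.append(rng.normal(base, 0.3, size=(n_trials_per_class, 4)))
    y.append(np.full(n_trials_per_class, quadrant))
X = np.vstack(X)
y = np.concatenate(y)

# The three classifiers compared in the study; hyperparameters are guesses.
classifiers = {
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2%} (random four-class baseline: 25%)")
```

The 25% baseline is the chance level for a balanced four-class problem, which is the reference point the abstract uses when noting that 57.05% is more than twice random accuracy.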
Pages: 9