Efficient Decoding of Affective States from Video-elicited EEG Signals: An Empirical Investigation

Cited by: 0
Authors
Latifzadeh, Kayhan [1 ]
Gozalpour, Nima [1 ]
Traver, V. Javier [2 ]
Ruotsalo, Tuukka [3 ,4 ]
Kawala-Sterniuk, Aleksandra [5 ]
Leiva, Luis A. [1 ]
Affiliations
[1] Univ Luxembourg, Esch Sur Alzette, Luxembourg
[2] Univ Jaume I, INIT, Castellon de la Plana, Spain
[3] Univ Copenhagen, Copenhagen, Denmark
[4] LUT Univ, Lahti, Finland
[5] Opole Univ Technol, Opole, Poland
Funding
EU Horizon 2020; Academy of Finland;
Keywords
Human-centered computing; User models; Ambient intelligence; Computing methodologies; Machine learning approaches; EMOTION RECOGNITION; EYE TRACKING; FRAMEWORK; SELECTION; FEATURES;
DOI
10.1145/3663669
CLC Number
TP [Automation technology, computer technology];
Subject Classification Code
0812;
Abstract
Affect decoding through brain-computer interfacing (BCI) holds great potential to capture users' feelings and emotional responses via non-invasive electroencephalogram (EEG) sensing. Yet, little research has been conducted to understand efficient decoding when users are exposed to dynamic audiovisual content. In this regard, we study EEG-based affect decoding from videos in arousal and valence classification tasks, considering the impact of signal length, window size for feature extraction, and frequency bands. We train both classic Machine Learning models (SVMs and k-NNs) and modern Deep Learning models (FCNNs and GTNs). Our results show that: (1) affect can be effectively decoded using less than 1 minute of EEG signal; (2) temporal windows of 6 and 10 seconds provide the best classification performance for classic Machine Learning models, but Deep Learning models benefit from much shorter windows of 2 seconds; and (3) any model trained on the Beta band alone achieves performance similar to (and sometimes better than) training on all frequency bands. Taken together, our results indicate that affect decoding can work in more realistic conditions than currently assumed, thus becoming a viable technology for creating better interfaces and user models.
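The pipeline described in the abstract (segment the EEG into short temporal windows, extract features from a single frequency band such as Beta, then train a classic classifier like an SVM) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, channel count, 2-second window length, and the synthetic data are all assumptions, and Welch band power stands in for whatever features the paper actually uses.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
FS = 128          # assumed sampling rate (Hz)
WIN = 2 * FS      # 2-second window, as found best for the DL models in the paper

def beta_band_power(window, fs=FS, band=(13.0, 30.0)):
    """Mean Welch PSD in the Beta band for one (channels, samples) window."""
    freqs, psd = welch(window, fs=fs, nperseg=min(fs, window.shape[-1]))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)  # one scalar feature per channel

# Synthetic stand-in for epoched EEG: 40 trials x 4 channels x 2 s of samples.
X_raw = rng.standard_normal((40, 4, WIN))
y = rng.integers(0, 2, size=40)  # binary labels, e.g. low vs. high arousal

X = np.array([beta_band_power(trial) for trial in X_raw])  # shape (40, 4)
clf = SVC(kernel="rbf")
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(X.shape, scores.shape)
```

Swapping `band` lets one compare Beta-only features against other bands (or a concatenation over all bands), which mirrors the comparison reported in finding (3).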
Pages: 24