An Emotion Recognition Method Based on Eye Movement and Audiovisual Features in MOOC Learning Environment

Cited by: 10
Authors
Bao, Jindi [1 ]
Tao, Xiaomei [2 ,3 ]
Zhou, Yinghui [1 ]
Affiliations
[1] Guilin Univ Technol, Sch Informat Sci & Engn, Guilin 541004, Peoples R China
[2] Guangxi Normal Univ, Sch Comp Sci & Engn, Guilin 541000, Peoples R China
[3] Guangxi Normal Univ, Sch Software, Guilin 541000, Peoples R China
Funding
National Science Foundation (US);
Keywords
Emotion recognition; feature extraction; massive online open course (MOOC); multimodal analysis;
DOI
10.1109/TCSS.2022.3221128
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In recent years, more and more people have begun to use massive online open course (MOOC) platforms for distance learning. However, due to the spatiotemporal separation between teachers and students, the negative emotional states of students during MOOC learning cannot be identified in a timely manner, so students cannot receive immediate feedback about their emotional states. To identify and classify learners' emotions in video learning scenarios, we propose a multimodal emotion recognition method based on eye movement signals, audio signals, and video images. Two novel features are proposed: the feature of coordinate difference of eye movement (FCDE) and the pixel change rate sequence (PCRS). FCDE is extracted by combining the eye movement coordinate trajectory with the video optical flow trajectory and represents the learner's degree of attention. PCRS is extracted from the video images and represents the speed of image switching. A feature extraction convolutional neural network (FE-CNN) is designed to extract deep features from the three modalities, and these deep features are fed into an emotion classification CNN (EC-CNN) to classify the emotions of interest, happiness, confusion, and boredom. In single-modality recognition, the accuracies of the three modalities are 64.32%, 74.67%, and 71.88%. The three modalities are combined by feature-level, decision-level, and model-level fusion, and the evaluation experiments show that decision-level fusion achieves the highest emotion recognition accuracy of 81.90%. Finally, the effectiveness of the FCDE, FE-CNN, and EC-CNN modules is verified by ablation experiments.
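The abstract describes PCRS as a measure of how fast the video image switches and FCDE as a combination of the gaze trajectory with the video optical flow trajectory. The following is a minimal sketch of one plausible reading of these two hand-crafted features; the paper's exact definitions may differ, and the function names, the normalization by 255, and the use of a single tracked optical-flow point per frame are illustrative assumptions, not the authors' implementation.

import numpy as np

def pixel_change_rate_sequence(frames):
    # PCRS sketch (assumed form): mean absolute intensity change between
    # consecutive grayscale frames, normalized to [0, 1], as a proxy for
    # the speed of image switching.
    frames = np.asarray(frames, dtype=np.float32)  # shape (T, H, W)
    diffs = np.abs(np.diff(frames, axis=0))        # frame-to-frame differences
    return diffs.mean(axis=(1, 2)) / 255.0         # one rate per transition, shape (T-1,)

def coordinate_difference_feature(gaze_xy, flow_xy):
    # FCDE sketch (assumed form): framewise Euclidean distance between the
    # gaze trajectory and an optical-flow point trajectory; small distances
    # suggest the gaze follows the on-screen motion (higher attention).
    gaze_xy = np.asarray(gaze_xy, dtype=np.float32)   # shape (T, 2)
    flow_xy = np.asarray(flow_xy, dtype=np.float32)   # shape (T, 2)
    return np.linalg.norm(gaze_xy - flow_xy, axis=1)  # shape (T,)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.integers(0, 256, size=(10, 48, 64))   # toy 10-frame grayscale clip
    gaze = rng.uniform(0, 64, size=(10, 2))            # toy gaze trajectory
    flow = gaze + rng.normal(0, 2, size=(10, 2))       # toy optical-flow trajectory
    print(pixel_change_rate_sequence(frames).shape)          # (9,)
    print(coordinate_difference_feature(gaze, flow).shape)   # (10,)

In the method described by the abstract, such framewise sequences would be passed to the FE-CNN for deep feature extraction before classification by the EC-CNN; the sketch stops at the raw feature sequences.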
Pages: 171-183
Number of pages: 13
Related Papers
50 records in total
  • [31] End-to-End Modeling and Transfer Learning for Audiovisual Emotion Recognition in-the-Wild. Dresvyanskiy, Denis; Ryumina, Elena; Kaya, Heysem; Markitantov, Maxim; Karpov, Alexey; Minker, Wolfgang. MULTIMODAL TECHNOLOGIES AND INTERACTION, 2022, 6 (02)
  • [32] Learning DenseNet features from EEG based spectrograms for subject independent emotion recognition. Pusarla, Nalini; Singh, Anurag; Tripathi, Shrivishal. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 74
  • [33] Deep Learning Model With Adaptive Regularization for EEG-Based Emotion Recognition Using Temporal and Frequency Features. Samavat, Alireza; Khalili, Ebrahim; Ayati, Bentolhoda; Ayati, Marzieh. IEEE ACCESS, 2022, 10: 24520-24527
  • [34] Electroencephalogram Emotion Recognition Based on Manifold Geomorphological Features in Riemannian Space. Wang, Yanbing; He, Hong. IEEE INTELLIGENT SYSTEMS, 2024, 39 (04): 23-36
  • [35] Emotion Recognition by a Hybrid System Based on the Features of Distances and the Shapes of the Wrinkles. Afdhal, Rim; Ejbali, Ridha; Zaied, Mourad. COMPUTER JOURNAL, 2020, 63 (03): 351-363
  • [36] Wavelet Packet Energy Features for EEG-Based Emotion Recognition. Algumaei, M. Y.; Hettiarachchi, Imali T.; Veerabhadrappa, Rakesh; Bhatti, Asim. 2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021: 1935-1940
  • [37] Can Emotion Be Transferred? A Review on Transfer Learning for EEG-Based Emotion Recognition. Li, Wei; Huan, Wei; Hou, Bowen; Tian, Ye; Zhang, Zhen; Song, Aiguo. IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (03): 833-846
  • [38] Deep Learning for EEG-based Emotion Recognition: A Survey. Li, J.-Y.; Du, X.-B.; Zhu, Z.-L.; Deng, X.-M.; Ma, C.-X.; Wang, H.-A. Ruan Jian Xue Bao/Journal of Software, 2023, 34 (01): 255-276
  • [39] A review on EEG-based multimodal learning for emotion recognition. Pillalamarri, Rajasekhar; Shanmugam, Udhayakumar. ARTIFICIAL INTELLIGENCE REVIEW, 2025, 58 (05)
  • [40] Emotion recognition and interaction of smart education environment screen based on deep learning networks. Zhao, Wei; Qiu, Liguo. JOURNAL OF INTELLIGENT SYSTEMS, 2025, 34 (01)