Acquisition and Analysis of Facial Electromyographic Signals for Emotion Recognition

Cited by: 1
Authors
Kolodziej, Marcin [1 ]
Majkowski, Andrzej [1 ]
Jurczak, Marcin [1 ]
Affiliation
[1] Warsaw University of Technology, Faculty of Electrical Engineering, Pl. Politechniki 1, PL-00661 Warsaw, Poland
Keywords
electromyography; EMG; signal analysis; emotion recognition; expression recognition; facial analysis; wearable device; expressions
DOI
10.3390/s24154785
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline classification codes
070302; 081704
Abstract
The objective of the article is to recognize users' emotions by classifying facial electromyographic (EMG) signals. A biomedical signal amplifier, equipped with eight active electrodes positioned in accordance with the Facial Action Coding System, was used to record the EMG signals. These signals were registered during a procedure in which users acted out six emotions: joy, sadness, surprise, disgust, anger, and fear, as well as a neutral state. Recordings were made for 16 users. The mean power of the EMG signals formed the feature set, which was used to train and evaluate various classifiers. In the subject-dependent model, the average classification accuracies were 96.3% for KNN, 94.9% for SVM with a linear kernel, 94.6% for SVM with a cubic kernel, and 93.8% for LDA. In the subject-independent model, the results varied with the tested user, ranging from 48.6% to 91.4% for the KNN classifier, with an average accuracy of 67.5%. The SVM with a cubic kernel performed slightly worse, achieving an average accuracy of 59.1%, followed by the SVM with a linear kernel at 53.9% and the LDA classifier at 41.2%. Additionally, the study identified the electrodes most effective at distinguishing between pairs of emotions.
Pages: 18
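
As an illustration of the pipeline described in the abstract, the sketch below shows how mean-power features could be computed from multichannel facial EMG and passed to the classifiers mentioned (KNN, SVM with linear and cubic kernels, LDA). It is a minimal example, not the authors' code: the sampling rate, the one-second window length, the synthetic data, and the scikit-learn defaults are all assumptions made for illustration.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    def mean_power_features(emg, fs=512, win_s=1.0):
        # emg: array of shape (n_samples, n_channels) of raw facial EMG.
        # fs (sampling rate) and win_s (window length) are assumed values.
        # Returns one mean-power value per channel per non-overlapping window.
        win = int(fs * win_s)
        n_win = emg.shape[0] // win
        feats = np.empty((n_win, emg.shape[1]))
        for i in range(n_win):
            seg = emg[i * win:(i + 1) * win]
            feats[i] = np.mean(seg ** 2, axis=0)  # mean power of each channel
        return feats

    # Synthetic stand-in for one user's recording: 8 electrodes, 70 one-second
    # windows, 10 windows per class for 7 classes (6 emotions + neutral).
    rng = np.random.default_rng(0)
    X = mean_power_features(rng.standard_normal((512 * 70, 8)))
    y = np.repeat(np.arange(7), 10)

    # Subject-dependent style evaluation: cross-validation within a single user's data.
    classifiers = [
        ("KNN", KNeighborsClassifier(n_neighbors=5)),
        ("SVM, linear kernel", SVC(kernel="linear")),
        ("SVM, cubic kernel", SVC(kernel="poly", degree=3)),
        ("LDA", LinearDiscriminantAnalysis()),
    ]
    for name, clf in classifiers:
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")

For the subject-independent setting reported in the abstract, the natural counterpart would be a leave-one-user-out split: train on the feature windows of 15 users and test on the remaining one, repeating for each of the 16 users.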