Human Emotion Detection through Speech and Facial Expressions

Cited by: 0
Authors
Kudiri, Krishna Mohan [1 ]
Said, Abas Md [1 ]
Nayan, M. Yunus [2 ]
Affiliations
[1] Univ Teknol PETRONAS, Dept Comp & Informat Sci, Bandar Seri Iskandar, Perak, Malaysia
[2] Univ Teknol PETRONAS, Dept Appl Sci, Bandar Seri Iskandar, Perak, Malaysia
Keywords
Relative Bin Frequency Coefficients (RBFC); Relative Sub-Image Based (RSB); Support Vector Machine (SVM)
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Estimating human emotions with a computer is difficult when the person is engaged in a conversational session. In this work, a hybrid system based on facial expressions and speech is proposed to estimate the basic emotions (anger, sadness, happiness, boredom, disgust, and surprise) of a person engaged in a conversational session. Relative Bin Frequency Coefficients (RBFC) and Relative Sub-Image Based (RSB) features are used for the acoustic and visual data, respectively, and a Support Vector Machine (SVM) with a radial basis function kernel is used for classification. The results revealed that the proposed feature extraction from speech and facial expressions, together with the proposed fusion technique, is the factor with the greatest effect on the emotion detection system; although other aspects also affect the system, their effect is relatively minor. It was observed that the bimodal emotion detection system performed worse than the unimodal system based on deliberate facial expressions. To address this issue, a suitable database was used. The results indicated that the proposed emotion detection system showed better performance on the basic emotion classes than the alternatives.
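The classification stage described in the abstract can be sketched as follows. This is a minimal illustration only: the RBFC and RSB feature extractors from the paper are not reproduced, so random vectors stand in for them, and the feature dimensions and the feature-level fusion by concatenation are assumptions, not details taken from the paper.

```python
# Sketch of SVM classification with an RBF kernel on fused acoustic/visual
# features, as described in the abstract. Feature contents, dimensions, and
# the concatenation-based fusion scheme are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

EMOTIONS = ["anger", "sadness", "happiness", "boredom", "disgust", "surprise"]

def fuse(acoustic, visual):
    """Feature-level fusion by simple concatenation (assumed scheme)."""
    return np.concatenate([acoustic, visual], axis=1)

# Stand-in features: 60 samples, a 13-dim "RBFC-like" acoustic vector and a
# 32-dim "RSB-like" visual vector per sample (dimensions are hypothetical).
acoustic = rng.normal(size=(60, 13))
visual = rng.normal(size=(60, 32))
labels = rng.integers(0, len(EMOTIONS), size=60)

X = fuse(acoustic, visual)                 # shape (60, 45)
clf = SVC(kernel="rbf", gamma="scale")     # radial basis function kernel
clf.fit(X, labels)

pred = clf.predict(X[:5])
print(pred.shape)                          # prints (5,)
```

With real data, the stand-in arrays would be replaced by the RBFC and RSB feature vectors extracted from the speech signal and face images of each conversational sample.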
Pages: 351-356
Page count: 6
Related Papers
50 records in total
  • [1] Emotion Detection through Speech and Facial Expressions
    Kudiri, Krishna Mohan
    Said, Abas Md
    Nayan, M. Yunus
    2014 INTERNATIONAL CONFERENCE ON COMPUTER ASSISTED SYSTEM IN HEALTH (CASH 2014), 2014, : 26 - 31
  • [2] Human Emotion Detection through Facial Expressions for Commercial Analysis
    Ruiz, Limuel Z.
    Alomia, Renmill Patrick V.
    Dantis, A. Dominic Q.
    Diego, Mark Joseph S. San
    Tindugan, Charlymiah F.
    Serrano, Kanny Krizzy D.
    2017 IEEE 9TH INTERNATIONAL CONFERENCE ON HUMANOID, NANOTECHNOLOGY, INFORMATION TECHNOLOGY, COMMUNICATION AND CONTROL, ENVIRONMENT AND MANAGEMENT (IEEE HNICEM), 2017,
  • [3] HUMAN EMOTION ESTIMATION THROUGH FACIAL EXPRESSIONS
    LEE, ET
    KYBERNETES, 1994, 23 (01) : 39 - 46
  • [4] Facial expressions of emotion in speech and singing
    Scotto di Carlo, N
    Guaïtella, I
    SEMIOTICA, 2004, 149 (1-4) : 37 - 55
  • [5] Unobtrusive multimodal emotion detection in adaptive interfaces: Speech and facial expressions
    Truong, Khiet P.
    van Leeuwen, David A.
    Neerincx, Mark A.
    FOUNDATIONS OF AUGMENTED COGNITION, PROCEEDINGS, 2007, 4565 : 354 - +
  • [6] Emotion Detection Using Relative Grid based Coefficients through Human Facial Expressions
    Kudiri, Krishna Mohan
    Said, Abas Md
    Nayan, M. Yunus
    2013 INTERNATIONAL CONFERENCE ON RESEARCH AND INNOVATION IN INFORMATION SYSTEMS (ICRIIS), 2013, : 45 - 48
  • [7] Alexithymia and detection of facial expressions of emotion.
    Prkachin, GC
    Prkachin, KM
    PSYCHOSOMATIC MEDICINE, 2001, 63 (01): : 135 - 136
  • [8] Common cues to emotion in the dynamic facial expressions of speech and song
    Livingstone, Steven R.
    Thompson, William F.
    Wanderley, Marcelo M.
    Palmer, Caroline
    QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 2015, 68 (05): : 952 - 970
  • [9] Bimodal Approach in Emotion Recognition using Speech and Facial Expressions
    Emerich, Simina
    Lupu, Eugen
    Apatean, Anca
    ISSCS 2009: INTERNATIONAL SYMPOSIUM ON SIGNALS, CIRCUITS AND SYSTEMS, VOLS 1 AND 2, PROCEEDINGS, 2009, : 297 - 300
  • [10] Multimodal Emotion Recognition Based on Facial Expressions, Speech, and EEG
    Pan, Jiahui
    Fang, Weijie
    Zhang, Zhihang
    Chen, Bingzhi
    Zhang, Zheng
    Wang, Shuihua
    IEEE OPEN JOURNAL OF ENGINEERING IN MEDICINE AND BIOLOGY, 2024, 5 : 396 - 403