A system for feature classification of emotions based on Speech Analysis; Applications to Human-Robot Interaction

Cited by: 0
Authors
Rabiei, Mohammad [1 ]
Gasparetto, Alessandro [1 ]
Affiliations
[1] Univ Udine, Dept Elect Engn Mech Engn & Management, Via Sci 206, I-33100 Udine, Italy
Source
2014 SECOND RSI/ISM INTERNATIONAL CONFERENCE ON ROBOTICS AND MECHATRONICS (ICROM) | 2014
Keywords
formant; pitch; speech analysis; speech rate; spectral features; recognition
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline classification codes
0808; 0809
Abstract
A system for the recognition of emotions based on speech analysis can have interesting applications in human-robot interaction. A robot should properly couple sound recognition and perception in order to create the desired emotional interaction with humans. Advanced research in this field will be based on sound analysis and the recognition of emotions in spontaneous dialog. In this paper, we report the results obtained from an exploratory study of a methodology to automatically recognize and classify basic emotional states. The study investigated the appropriateness of using acoustic and phonetic properties of emotive speech with minimal use of signal processing algorithms. The efficiency of the methodology was evaluated by experimental tests on adult European speakers. The speakers had to repeat six simple sentences in English in order to emphasize features of the pitch (peak, value and range), the intensity of the speech, the formants and the speech rate. The proposed methodology uses the freeware program PRAAT and consists of generating and analyzing graphs of the pitch, formants and intensity of the speech signal in order to classify basic emotions. The proposed model provided successful recognition of the basic emotions in most cases.
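The pitch feature central to the methodology above can be illustrated with a minimal autocorrelation-based fundamental-frequency estimator. This is a hedged sketch, not the authors' actual PRAAT-based pipeline: `estimate_pitch` is a hypothetical helper standing in for PRAAT's pitch tracker, and the 220 Hz tone is a synthetic substitute for recorded speech.

```python
import numpy as np

def estimate_pitch(signal, sr, fmin=75.0, fmax=500.0):
    """Estimate the fundamental frequency (Hz) of a voiced frame via
    autocorrelation; a simplified stand-in for PRAAT's pitch tracker."""
    signal = signal - np.mean(signal)            # remove DC offset
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lag_min = int(sr / fmax)                     # shortest plausible period
    lag_max = int(sr / fmin)                     # longest plausible period
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sr / best_lag

sr = 16000
t = np.arange(sr) / sr                           # one second of samples
tone = np.sin(2 * np.pi * 220.0 * t)             # synthetic 220 Hz "voice"
f0 = estimate_pitch(tone, sr)
print(f"estimated pitch: {f0:.1f} Hz")           # close to 220 Hz
```

In a real pipeline these estimates would be computed frame by frame, yielding the pitch peak, mean and range the paper uses as classification features.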
Pages: 795-800
Page count: 6
Related papers
50 results total
  • [11] Coupled Multimodal Emotional Feature Analysis Based on Broad-Deep Fusion Networks in Human-Robot Interaction
    Chen, Luefeng
    Li, Min
    Wu, Min
    Pedrycz, Witold
    Hirota, Kaoru
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 9663 - 9673
  • [12] Design and Evaluation of a Framework for Reciprocal Speech Interaction in Human-Robot Collaboration
    Bolano, Gabriele
    Iviani, Lawrence
    Roennau, Arne
    Dillmann, Ruediger
    2021 30TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2021, : 806 - 812
  • [13] Sustainable Human-Robot Collaboration Based on Human Intention Classification
    Lin, Chiuhsiang Joe
    Lukodono, Rio Prasetyo
    SUSTAINABILITY, 2021, 13 (11)
  • [14] Human-Robot Interaction based on Facial Expression Imitation
    Esfandbod, Alireza
    Rokhi, Zeynab
    Taheri, Alireza
    Alemi, Minoo
    Meghdari, Ali
    2019 7TH INTERNATIONAL CONFERENCE ON ROBOTICS AND MECHATRONICS (ICROM 2019), 2019, : 69 - 73
  • [15] Classification of Hand Postures Based on 3D Vision Model for Human-Robot Interaction
    Takimoto, Hironori
    Yoshimori, Seiki
    Mitsukura, Yasue
    Fukumi, Minoru
    2010 IEEE RO-MAN, 2010, : 292 - 297
  • [16] Interaction modeling and classification scheme for augmenting the response accuracy of human-robot interaction systems
    Tao, Hai
    Rahman, Md Arafatur
    Jing, Wang
    Li, Yafeng
    Li, Jing
    Al-Saffar, Ahmed
    Zhang, Renrui
    Salih, Sinan Q.
    WORK-A JOURNAL OF PREVENTION ASSESSMENT & REHABILITATION, 2021, 68 (03): : 903 - 912
  • [17] Augmented Robotics Dialog System for Enhancing Human-Robot Interaction
    Alonso-Martin, Fernando
    Castro-Gonzalez, Alvaro
    de Gorostiza Luengo, Francisco Javier Fernandez
    Angel Salichs, Miguel
    SENSORS, 2015, 15 (07): : 15799 - 15829
  • [18] A Multimodal Emotion Detection System during Human-Robot Interaction
    Alonso-Martin, Fernando
    Malfaz, Maria
    Sequeira, Joao
    Gorostiza, Javier F.
    Salichs, Miguel A.
    SENSORS, 2013, 13 (11) : 15549 - 15581
  • [19] A real-time Human-Robot Interaction system based on gestures for assistive scenarios
    Canal, Gerard
    Escalera, Sergio
    Angulo, Cecilio
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2016, 149 : 65 - 77
  • [20] Pantomimic Gestures for Human-Robot Interaction
    Burke, Michael
    Lasenby, Joan
    IEEE TRANSACTIONS ON ROBOTICS, 2015, 31 (05) : 1225 - 1237