A Methodology for Recognition of Emotions Based on Speech Analysis, for Applications to Human-Robot Interaction. An Exploratory Study

Cited by: 0
Authors
Mohammad Rabiei; Alessandro Gasparetto
Institution
[1] DIEGM, Università di Udine, Via delle Scienze 206, Udine
Source
De Gruyter Open Ltd, Vol. 05 (2014)
Keywords
Emotion; Human-Robot Interaction; Speech Analysis
DOI
10.2478/pjbr-2014-0001
Abstract
A system for recognition of emotions based on speech analysis can have interesting applications in human-robot interaction. In this paper, we carry out an exploratory study on the possibility of using the proposed methodology to recognize basic emotions (sadness, surprise, happiness, anger, fear and disgust) from phonetic and acoustic properties of emotive speech, with minimal use of signal-processing algorithms. We set up an experimental test involving three types of speakers: (i) five adult European speakers, (ii) five adult Asian (Middle East) speakers and (iii) five adult American speakers. The speakers had to repeat six sentences in English (with durations typically between 1 s and 3 s) in order to emphasize rising-falling intonation and pitch movement. Intensity, peak and range of pitch, and speech rate were evaluated. The proposed methodology consists of generating and analyzing a graph of formant, pitch and intensity, using the open-source PRAAT program. From the experimental results, it was possible to recognize the basic emotions in most of the cases. © 2014 Mohammad Rabiei and Alessandro Gasparetto.
Pages: 1-11
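
The abstract above describes evaluating intensity, peak and range of pitch, and speech rate from short emotive utterances with the open-source PRAAT program. The sketch below is only an illustration of how such acoustic measures might be extracted programmatically, assuming the parselmouth Python interface to Praat; the file name, analysis settings and the voiced-fraction tempo proxy are assumptions for this sketch, not taken from the paper, which worked with Praat's own graphs of formant, pitch and intensity.

# Minimal sketch (assumption): pitch peak/range, mean intensity and a crude
# tempo proxy via the parselmouth interface to Praat. The original study used
# the Praat program directly; settings and file name here are illustrative.
import parselmouth
from parselmouth.praat import call

snd = parselmouth.Sound("utterance.wav")   # one 1-3 s emotive sentence (hypothetical file)

# Pitch contour (75-600 Hz search range, 10 ms steps; illustrative settings)
pitch = snd.to_pitch(time_step=0.01, pitch_floor=75.0, pitch_ceiling=600.0)
f0 = pitch.selected_array['frequency']
f0 = f0[f0 > 0]                            # keep voiced frames only
pitch_peak = f0.max()                      # peak of pitch (Hz)
pitch_range = f0.max() - f0.min()          # range of pitch (Hz)

# Mean intensity of the utterance (dB)
intensity = snd.to_intensity()
mean_intensity = call(intensity, "Get mean", 0, 0, "energy")

# Formant object, corresponding to the formant graph analyzed in the paper
formants = snd.to_formant_burg()

# Crude tempo proxy: fraction of the utterance that is voiced
# (the paper's speech rate would instead require word/syllable counts)
total_duration = call(snd, "Get total duration")
voiced_fraction = len(f0) * 0.01 / total_duration

print(pitch_peak, pitch_range, mean_intensity, voiced_fraction)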