Expressive Robotic Head for Human-Robot Interaction Studies

Cited by: 0
Authors
Pereira, Ricardo [1 ]
Garrote, Luis [1 ,2 ]
Barros, Tiago [1 ]
Carona, Carlos [3 ]
Bento, Luis C. [1 ]
Nunes, Urbano J. [1 ,2 ]
Affiliations
[1] Univ Coimbra, Inst Syst & Robot, Coimbra, Portugal
[2] Univ Coimbra, Dept Elect & Comp Engn, Coimbra, Portugal
[3] Univ Coimbra, Ctr Res Neuropsychol & Cognit Behav Intervent, Cerebral Palsy Assoc Coimbra, Coimbra, Portugal
Source
XV MEDITERRANEAN CONFERENCE ON MEDICAL AND BIOLOGICAL ENGINEERING AND COMPUTING - MEDICON 2019 | 2020, Vol. 76
Keywords
Human-robot interaction; LCD-based robotic head; Facial expressions; Speech recognition;
DOI
10.1007/978-3-030-31635-8_200
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we present an improvement of the interaction capabilities of the ISR-RobotHead prototype, with a view to future Human-Robot Interaction studies. To accomplish this, new hardware and software architectures and new facial expressions were developed. The new prototype (ISR-RobotHead v2.0) uses LCDs to display six human facial emotional expressions, avoiding the use of interlinked mechanical systems. The prototype also incorporates cameras, microphones, speakers, and LEDs on the robot's cheeks. To validate the new cartoon facial expressions, an exploratory study was performed with children (4-6 years old) and adults (23-24 years old), who were asked to identify the expression displayed by the ISR-RobotHead v2.0; for the children, accuracy in detecting the emotional expressions was evaluated using a corresponding set of cards picturing a child's emotional facial expressions. Preliminary results successfully validate the proposed robot's non-verbal communication.
Pages: 1627-1637
Page count: 11
Related Papers
50 records in total
[21]   The Effect of Multiple Robot Interaction on Human-Robot Interaction [J].
Yang, Jeong-Yean ;
Kwon, Dong-Soo .
2012 9TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2012, :30-33
[22]   Precision Timing in Human-Robot Interaction: Coordination of Head Movement and Utterance [J].
Yamazaki, Akiko ;
Yamazaki, Keiichi ;
Kuno, Yoshinori ;
Burdelski, Matthew ;
Kawashima, Michie ;
Kuzuoka, Hideaki .
CHI 2008: 26TH ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS VOLS 1 AND 2, CONFERENCE PROCEEDINGS, 2008, :131-139
[23]   Human-Robot Proxemics: Physical and Psychological Distancing in Human-Robot Interaction [J].
Mumm, Jonathan ;
Mutlu, Bilge .
PROCEEDINGS OF THE 6TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTIONS (HRI 2011), 2011, :331-338
[24]   Communication in Human-Robot Interaction [J].
Bonarini, Andrea .
Current Robotics Reports, 2020, 1 (4) :279-285
[25]   Expressiveness in human-robot interaction [J].
Marti, Patrizia ;
Giusti, Leonardo ;
Pollini, Alessandro ;
Rullo, Alessia .
INTERACTION DESIGN AND ARCHITECTURES, 2008, (5-6) :93-98
[26]   The Science of Human-Robot Interaction [J].
Kiesler, Sara ;
Goodrich, Michael A. .
ACM TRANSACTIONS ON HUMAN-ROBOT INTERACTION, 2018, 7 (01)
[27]   Sound in Human-Robot Interaction [J].
Pelikan, Hannah ;
Robinson, Frederic Anthony ;
Keevallik, Leelo ;
Velonaki, Mari ;
Broth, Mathias ;
Bown, Oliver .
HRI '21: COMPANION OF THE 2021 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2021, :706-708
[28]   Semiotics and human-robot interaction [J].
Sequeira, Joao Silva ;
Ribeiro, Maria Isabel .
ICINCO 2006: PROCEEDINGS OF THE THIRD INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS: ROBOTICS AND AUTOMATION, 2006, :58-65
[29]   'Food' for Human-Robot Interaction [J].
Kamino, Waki ;
Joshi, Swapna ;
Sabanovic, Selma .
HRI '21: COMPANION OF THE 2021 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2021, :131-135
[30]   Immersive Human-Robot Interaction [J].
Sandygulova, Anara ;
Campbell, Abraham G. ;
Dragone, Mauro ;
O'Hare, G. M. P. .
HRI'12: PROCEEDINGS OF THE SEVENTH ANNUAL ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2012, :227-228