Developing a Dynamic Expression Model That Can Simultaneously Control Robot's Facial and Movement Expressions

Cited by: 0
Authors
Park H. [1 ]
Lee J. [2 ]
Dzhoroev T. [2 ]
Kim B. [1 ]
Lee H.S. [2 ]
Affiliations
[1] Department of Creative Design Engineering, Ulsan National Institute of Science and Technology
[2] Department of Design, Ulsan National Institute of Science and Technology
Keywords
computational emotion model; human-robot interaction; social robot
DOI
10.5302/J.ICROS.2024.23.0133
CLC number
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Social robots commonly rely on facial expressions and gestures to convey emotions. However, many robots follow a predetermined sequence, executing a fixed set of facial animations and movements once an emotion is identified. This rigid approach can produce unnatural behavior when additional stimuli arrive during an ongoing emotional expression: the robot may ignore the new stimulus until the current emotion is fully expressed, or abruptly jump to the next one. To address this limitation, we implemented an emotion engine with a linear dynamic affect-expression model (LDAEM) that computes the emotional state from stimuli and determines the corresponding facial expression and robot movements. Building on Ekman's six basic emotions, our emotion engine drives 12 control points (CPs) for facial expression and 3 CPs for movement. Experimental results demonstrate that the expressed emotion adapts dynamically to stimuli. Notably, our approach allows smooth transitions between emotions, even when a different emotional stimulus is introduced during an ongoing emotional expression. Moreover, it can be seamlessly applied to other robotic systems, offering a versatile framework for emotional expression. © ICROS 2024.
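
To make the abstract's mechanism concrete, below is a minimal illustrative sketch of how a linear dynamic affect-expression model might map stimuli to an emotion state and then to control points. Everything here is an assumption for illustration, not the authors' implementation: the class name EmotionEngine, the first-order decay update, the decay and gain parameters, and the placeholder mapping matrices M_face and M_move. The paper's actual LDAEM formulation is not reproduced.

import numpy as np

N_EMOTIONS = 6   # Ekman's six basic emotions
N_FACE_CP = 12   # facial-expression control points (per the abstract)
N_MOVE_CP = 3    # movement control points (per the abstract)

class EmotionEngine:
    """Hypothetical sketch of a linear dynamic affect-expression model."""

    def __init__(self, decay=0.9, gain=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.e = np.zeros(N_EMOTIONS)   # current emotion state vector
        self.decay = decay              # how quickly emotion fades (assumed)
        self.gain = gain                # stimulus influence (assumed)
        # Linear maps from emotion space to control-point space.
        # Random placeholders; real matrices would be designed or learned.
        self.M_face = rng.uniform(-1.0, 1.0, (N_FACE_CP, N_EMOTIONS))
        self.M_move = rng.uniform(-1.0, 1.0, (N_MOVE_CP, N_EMOTIONS))

    def step(self, stimulus):
        """Update the emotion state with one stimulus and return the CPs.

        Because the update is linear in the state, a new stimulus arriving
        mid-expression simply shifts the state, so the mapped control points
        transition smoothly instead of restarting a canned animation.
        """
        stimulus = np.asarray(stimulus, dtype=float)
        self.e = self.decay * self.e + self.gain * stimulus
        face_cp = self.M_face @ self.e   # 12 facial control points
        move_cp = self.M_move @ self.e   # 3 movement control points
        return face_cp, move_cp

engine = EmotionEngine()
happy = np.array([1, 0, 0, 0, 0, 0])      # stimulus toward "happiness"
surprise = np.array([0, 0, 0, 0, 0, 1])   # interrupting stimulus
engine.step(happy)
face, move = engine.step(surprise)        # smooth blend, no restart

The key design point the sketch tries to capture is that emotion is a continuous state updated by every stimulus, rather than a trigger for a fixed animation, which is what enables the smooth mid-expression transitions described in the abstract.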
Pages: 8-12
Number of pages: 4