Body Language in Affective Human-Robot Interaction

Cited by: 12
Authors
Stoeva, Darja [1 ]
Gelautz, Margrit [1 ]
Affiliations
[1] TU Wien, Inst Visual Comp & Human Ctr Technol, Vienna, Austria
Source
HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION | 2020
Keywords
human-robot interaction; body language; robotics; affective computing; social signals; EMOTION; RECOGNITION; PERCEPTION;
DOI
10.1145/3371382.3377432
CLC number
TP3 [Computing technology, computer technology];
Subject classification code
0812
Abstract
Social human-robot interaction is concerned with exploring the ways in which social interaction can be achieved between a human and a sociable robot. Affect plays an important role in interaction, as it helps interactants coordinate and signals the success of the communication. Designing socially intelligent robots requires competence in communication, which includes the exchange of both verbal and non-verbal cues. This project will focus on non-verbal communication, more specifically body movements, postures, and gestures as a means of conveying socially affective information. Using the affective grounding perspective, which conceptualizes emotion as a coordination mechanism, together with honest signals as a measure of the dynamics of the interaction, and the robot Pepper, we aim to develop a system able to communicate affect, with the goal of enhancing affective human-robot interaction.
Pages: 606 - 608
Page count: 3
Related papers
50 records total
  • [21] A Body Emotion-Based Human-Robot Interaction
    Zhu, Tehao
    Zhao, Qunfei
    Xiong, Jing
    COMPUTER VISION SYSTEMS, ICVS 2017, 2017, 10528 : 268 - 278
  • [22] Body Movement Mirroring and Synchrony in Human-Robot Interaction
    Stoeva, Darja
    Kriegler, Andreas
    Gelautz, Margrit
    ACM TRANSACTIONS ON HUMAN-ROBOT INTERACTION, 2024, 13 (04)
  • [23] Human-Robot Interaction by Understanding Upper Body Gestures
    Xiao, Yang
    Zhang, Zhijun
    Beck, Aryel
    Yuan, Junsong
    Thalmann, Daniel
    PRESENCE-VIRTUAL AND AUGMENTED REALITY, 2014, 23 (02): 133 - 154
  • [24] Upper Body Gesture Recognition for Human-Robot Interaction
    Oh, Chi-Min
    Islam, Md Zahidul
    Lee, Jun-Sung
    Lee, Chil-Woo
    Kweon, In-So
    HUMAN-COMPUTER INTERACTION: INTERACTION TECHNIQUES AND ENVIRONMENTS, PT II, 2011, 6762 : 294 - 303
  • [25] HUMAN AFFECTIVE STATE RECOGNITION AND CLASSIFICATION DURING HUMAN-ROBOT INTERACTION SCENARIOS
    Zhang, Zhe
    Nejat, Goldie
    PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, VOL 3, 2010, : 435 - 441
  • [26] Design of a Soft Upper Body Robot for Physical Human-Robot Interaction
    Alspach, Alexander
    Kim, Joohyung
    Yamane, Katsu
    2015 IEEE-RAS 15TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2015, : 290 - 296
  • [27] Human-Robot Interaction
    Jia, Yunyi
    Zhang, Biao
    Li, Miao
    King, Brady
    Meghdari, Ali
    JOURNAL OF ROBOTICS, 2018, 2018
  • [28] The Affective Loop: A Tool for Autonomous and Adaptive Emotional Human-Robot Interaction
    Vircikova, Maria
    Magyar, Gergely
    Sincak, Peter
    ROBOT INTELLIGENCE TECHNOLOGY AND APPLICATIONS 3, 2015, 345 : 247 - 254
  • [29] EMOGIB: Emotional Gibberish Speech Database for Affective Human-Robot Interaction
    Yilmazyildiz, Selma
    Henderickx, David
    Vanderborght, Bram
    Verhelst, Werner
    Soetens, Eric
    Lefeber, Dirk
    AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION, PT II, 2011, 6975 : 163 - 172
  • [30] Human-Robot Interaction
    Sidobre, Daniel
    Broquere, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    ADVANCED BIMANUAL MANIPULATION: RESULTS FROM THE DEXMART PROJECT, 2012, 80 : 123 - +