Touch-text answer for human-robot interaction via supervised adversarial learning

Cited by: 4
Authors
Wang, Ya-Xin [1 ]
Meng, Qing-Hao [1 ]
Li, Yun-Kai [2 ]
Hou, Hui-Rang [1 ]
Affiliations
[1] Tianjin Univ, Inst Robot & Autonomous Syst, Sch Elect & Informat Engn, Tianjin 300072, Peoples R China
[2] Zhengzhou Univ, Sch Elect & Informat Engn, Zhengzhou 450001, Henan, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Human-robot interaction; Cross-modal retrieval; Adversarial learning; Touch gesture; Text;
DOI
10.1016/j.eswa.2023.122738
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In daily life, the touch modality plays an important role in conveying human intentions and emotions. To further improve touch-based human-robot interaction, robots need to infer human emotions from touch signals and respond accordingly, so correlating the emotional state of touch gestures with text responses is a major challenge. To date, there has been little research on touch-text dialogue: robots cannot respond to human tactile gestures with appropriate text, so touch-text-based human-robot interaction is not yet possible. To address these problems, we first built a touch-text dialogue (TTD) corpus based on six basic emotions through experiments, containing 1109 touch-text sample pairs. We then designed a supervised adversarial learning for touch-text answer (SATTA) model to realize touch-text-based human-robot interaction. The SATTA model correlates the text modality with the touch modality by reducing the emotion discrimination loss in the common space and the feature difference between sample pairs of the two modalities. At the same time, the feature representation is mapped into the label space to reduce the classification loss of samples. Experiments on the TTD corpus validate the proposed method.
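The abstract names three interacting terms: a discrimination loss in the common space, a feature-difference loss between touch-text sample pairs, and a classification loss in the label space. As a rough orientation, the sketch below shows how such a combined objective could be wired up in PyTorch; the module names and loss choices (MSE for the pair term, a two-way modality discriminator) are illustrative assumptions, not the authors' actual formulation.

```python
# A minimal sketch of a SATTA-style joint objective, assuming paired
# touch/text features already projected into a shared space. All names
# here (SharedSpaceHeads, modality_disc, emotion_clf) are hypothetical
# illustrations, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedSpaceHeads(nn.Module):
    def __init__(self, dim=256, num_emotions=6):
        super().__init__()
        # Modality discriminator for the adversarial part: it tries to
        # tell touch features from text features; the encoders would be
        # trained to fool it (e.g. via a gradient reversal layer).
        self.modality_disc = nn.Sequential(
            nn.Linear(dim, dim // 2), nn.ReLU(),
            nn.Linear(dim // 2, 2))
        # Emotion classifier mapping shared features into the label space.
        self.emotion_clf = nn.Linear(dim, num_emotions)

    def forward(self, touch_feat, text_feat, labels):
        # (1) Feature difference between sample pairs of the two modalities.
        pair_loss = F.mse_loss(touch_feat, text_feat)
        # (2) Classification loss of samples in the label space.
        clf_loss = (F.cross_entropy(self.emotion_clf(touch_feat), labels) +
                    F.cross_entropy(self.emotion_clf(text_feat), labels))
        # (3) Discrimination loss in the common space (0 = touch, 1 = text).
        feats = torch.cat([touch_feat, text_feat], dim=0)
        mods = torch.cat([feats.new_zeros(len(touch_feat), dtype=torch.long),
                          feats.new_ones(len(text_feat), dtype=torch.long)])
        adv_loss = F.cross_entropy(self.modality_disc(feats), mods)
        return pair_loss, clf_loss, adv_loss
```

In a full adversarial setup, the two modality encoders would minimize pair_loss and clf_loss while working against the discriminator on adv_loss, with the terms balanced by weighting hyperparameters.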
Pages: 10