Using human gaze in few-shot imitation learning for robot manipulation

Cited: 1
Authors
Hamano, Shogo [1 ]
Kim, Heecheol [1 ]
Ohmura, Yoshiyuki [1 ]
Kuniyoshi, Yasuo [1 ]
Affiliations
[1] Univ Tokyo, Grad Sch Informat Sci & Technol, Lab Intelligent Syst & Informat, Bunkyo Ku, 7-3-1 Hongo, Tokyo, Japan
Source
2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2022
Keywords
Imitation Learning; Deep Learning in Grasping and Manipulation; Few-shot Learning; Meta-learning; Telerobotics and Teleoperation;
DOI
10.1109/IROS47612.2022.9981706
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Imitation learning has attracted attention as a method for realizing complex robot control without hand-programmed robot behavior. Meta-imitation learning has been proposed to address the high data-collection cost and poor generalization to new tasks from which imitation learning suffers. Meta-imitation learning can learn new tasks involving unknown objects from a small amount of data by learning multiple tasks during training. However, meta-imitation learning, especially from images, remains vulnerable to changes in the background, which occupies a large portion of the input image. This study introduces human gaze into meta-imitation-learning-based robot control. Using gaze measured with an eye tracker in a head-mounted display, we trained a model with model-agnostic meta-learning to predict the gaze position from the image. Using images around the predicted gaze position as input makes the model robust to changes in visual information. We experimentally verified the performance of the proposed method on picking tasks with a simulated robot. The results indicate that our proposed method learns a new task from only 9 demonstrations better than the conventional method, even when the object's color or the background pattern changes between training and testing.
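The gaze-centered cropping described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function name, window size, and clamping behavior are assumptions:

```python
import numpy as np

def crop_around_gaze(image, gaze_xy, crop_size=64):
    """Crop a square window centered on a (predicted) gaze position.

    image:    H x W x C array (e.g. a camera frame).
    gaze_xy:  (x, y) predicted gaze coordinate in pixels.
    The center is clamped so the window stays inside the image bounds,
    which keeps the crop shape fixed near image borders.
    """
    h, w = image.shape[:2]
    half = crop_size // 2
    x = int(np.clip(gaze_xy[0], half, w - half))
    y = int(np.clip(gaze_xy[1], half, h - half))
    return image[y - half:y + half, x - half:x + half]
```

Feeding only this window to the policy discards most of the background pixels, which is why the method is less sensitive to background changes between training and testing.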
Pages: 8622 - 8629
Page count: 8