A Gaze-Speech System in Mixed Reality for Human-Robot Interaction

Cited by: 1
Authors
Prada, John David Prieto [1 ]
Lee, Myung Ho [1 ]
Song, Cheol [1 ]
Affiliations
[1] DGIST, Dept Robot & Mechatron Engn, Daegu 42988, South Korea
Source
2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2023) | 2023
DOI
10.1109/ICRA48891.2023.10161010
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Human-robot interaction (HRI) demands efficient time performance across tasks, yet some interaction approaches lengthen the time needed to complete them. This work presents an effective way to improve time performance in HRI tasks using a mixed reality (MR) method based on a gaze-speech system. We design an MR environment for pick-and-place tasks. The hardware setup comprises an MR headset, a Baxter robot, a table, and six cubes. The holographic MR scenario offers two interaction modes: gesture mode (GM), driven by the pinch gesture, and gaze-speech mode (GSM), driven by gaze combined with speech commands. The proposed GSM approach improves time performance in pick-and-place scenarios, completing tasks 21.33% faster than the traditional GM. We also evaluated target-to-target time performance against a reference based on Fitts' law. Our findings demonstrate a promising method for reducing task time in HRI through MR environments.
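
Note: the abstract evaluates target-to-target times against "a reference based on Fitts' law" without spelling out the model. A minimal sketch follows, assuming the widely used Shannon formulation, MT = a + b * log2(A/W + 1), where MT is movement time, A the target-to-target distance (amplitude), W the target width, and a and b empirically fitted constants. The constants, target dimensions, and function name below are illustrative placeholders, not values from the paper.

    import math

    def predicted_movement_time(amplitude, width, a=0.0, b=0.1):
        """Predicted movement time (seconds) under the Shannon
        formulation of Fitts' law: MT = a + b * log2(A/W + 1).

        a (intercept, s) and b (slope, s/bit) are placeholder values;
        in practice they are fitted by regressing measured movement
        times on the index of difficulty.
        """
        index_of_difficulty = math.log2(amplitude / width + 1.0)  # bits
        return a + b * index_of_difficulty

    # Illustrative example: two targets 0.30 m apart, each 0.05 m wide.
    # ID = log2(7) ~= 2.81 bits, so MT ~= 0.28 s with these placeholders.
    print(predicted_movement_time(0.30, 0.05))

Measured GM and GSM target-to-target times can then be compared against this baseline; in Fitts'-law terms, the mode with the smaller fitted slope b transmits more bits per second and is the faster pointing method.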
Pages: 7547-7553 (7 pages)