Symmetric Evaluation of Multimodal Human-Robot Interaction with Gaze and Standard Control

Cited by: 3
Authors
Jones, Ethan R. [1 ]
Chinthammit, Winyu [1 ]
Huang, Weidong [2 ]
Engelke, Ulrich [3 ]
Lueg, Christopher [1 ]
Affiliations
[1] Univ Tasmania, Sch Technol Environm & Design, Hobart, Tas 7005, Australia
[2] Swinburne Univ Technol, Sch Software & Elect Engn, Hawthorn, Vic 3122, Australia
[3] CSIRO, Kensington, WA 6020, Australia
Source
SYMMETRY-BASEL | 2018, Vol. 10, Issue 12
Keywords
multimodal interaction; eye tracking; empirical evaluation; human-robot interaction; COGNITIVE LOAD; TRACKING; DESIGN;
DOI
10.3390/sym10120680
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07 ; 0710 ; 09 ;
Abstract
Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination of the two in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to move a chess piece from one square to another pre-specified square; the second was the same as the first but required more moves to complete; and the third was to move multiple pieces to reach a pre-defined arrangement of the pieces. While gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations in spatial accuracy and target selection. The multimodal setup aimed to mitigate these weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment showed that the multimodal setup improved performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
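The abstract positions gaze as the coarse pointing channel whose spatial-accuracy and target-selection weaknesses the controller is meant to offset. The snippet below is a minimal, hypothetical sketch of one way such a combination could work, assuming the gaze point is snapped to a chessboard square and a controller button confirms the selection; all names, thresholds, and coordinate conventions are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch only: one way a gaze estimate and a controller press
# could be fused to select a chessboard square. Names, thresholds, and the
# snapping rule are illustrative assumptions, not taken from the paper.
from dataclasses import dataclass
from typing import Optional, Tuple

SQUARE_SIZE_MM = 50.0  # assumed physical size of one board square

@dataclass
class GazeSample:
    x_mm: float        # gaze point on the board plane, board coordinates
    y_mm: float
    confidence: float  # tracker-reported sample quality in [0, 1]

def snap_to_square(sample: GazeSample) -> Optional[Tuple[int, int]]:
    """Snap a noisy gaze point to the nearest square, rejecting poor samples.

    Snapping to the square grid compensates for limited spatial accuracy;
    rejecting low-confidence samples avoids spurious target selection.
    """
    if sample.confidence < 0.6:  # assumed rejection threshold
        return None
    col = int(sample.x_mm // SQUARE_SIZE_MM)
    row = int(sample.y_mm // SQUARE_SIZE_MM)
    if 0 <= col < 8 and 0 <= row < 8:
        return col, row
    return None

def select_target(sample: GazeSample, confirm_pressed: bool) -> Optional[Tuple[int, int]]:
    """Return a square only when gaze is on it AND the controller confirms."""
    square = snap_to_square(sample)
    return square if (square is not None and confirm_pressed) else None

if __name__ == "__main__":
    gaze = GazeSample(x_mm=137.0, y_mm=212.0, confidence=0.9)
    print(select_target(gaze, confirm_pressed=True))   # -> (2, 4)
    print(select_target(gaze, confirm_pressed=False))  # -> None
```

The design intent in this sketch mirrors the division of labour described in the abstract: gaze proposes the target cheaply, while the explicit controller confirmation guards against unintended selections.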
Pages: 15