Symmetric Evaluation of Multimodal Human-Robot Interaction with Gaze and Standard Control

Cited by: 3
Authors
Jones, Ethan R. [1]
Chinthammit, Winyu [1]
Huang, Weidong [2]
Engelke, Ulrich [3]
Lueg, Christopher [1]
Affiliations
[1] Univ Tasmania, Sch Technol Environm & Design, Hobart, Tas 7005, Australia
[2] Swinburne Univ Technol, Sch Software & Elect Engn, Hawthorn, Vic 3122, Australia
[3] CSIRO, Kensington, WA 6020, Australia
Source
SYMMETRY-BASEL | 2018 / Vol. 10 / Issue 12
Keywords
multimodal interaction; eye tracking; empirical evaluation; human-robot interaction; COGNITIVE LOAD; TRACKING; DESIGN;
DOI
10.3390/sym10120680
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a hand controller, an eye gaze tracker, and a combination of the two in a multimodal setup for controlling a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve them. More specifically, there were three tasks: the first was to move a chess piece from one square to another pre-specified square; the second was the same as the first but required more moves to complete; and the third was to move multiple pieces into a pre-defined target arrangement. While gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations in spatial accuracy and target selection. The multimodal setup aimed to mitigate these weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment showed that the multimodal setup improved performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
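The two reported p-values come from within-subjects comparisons over the twenty participants. As an illustration only, the sketch below runs paired t-tests on hypothetical per-participant completion times for the three input conditions; the measure, the test, and all data values are assumptions, since the abstract does not specify how the statistics were computed.

    # Illustrative sketch (Python): paired comparisons of the three input
    # conditions named in the abstract. All numbers are synthetic; the paired
    # t-test is an assumed choice, not necessarily the test used in the paper.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_participants = 20  # as stated in the abstract

    # Hypothetical task completion times in seconds, one value per participant.
    controller = rng.normal(60, 10, n_participants)
    gaze = rng.normal(80, 15, n_participants)
    multimodal = rng.normal(62, 11, n_participants)

    # Mirrors the abstract's two claims: multimodal vs. gaze alone, and
    # multimodal vs. controller alone.
    _, p_gaze = stats.ttest_rel(multimodal, gaze)
    _, p_ctrl = stats.ttest_rel(multimodal, controller)
    print(f"multimodal vs gaze alone:      p = {p_gaze:.3f}")
    print(f"multimodal vs controller only: p = {p_ctrl:.3f}")

With the synthetic numbers above, the first comparison tends to come out significant and the second does not, which matches the pattern the abstract reports; with real data the same paired structure would apply per task.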
Pages: 15
Related Papers
50 records in total
  • [31] Multimodal Fusion as Communicative Acts during Human-Robot Interaction
    Alonso-Martin, Fernando
    Gorostiza, Javier F.
    Malfaz, Maria
    Salichs, Miguel A.
    CYBERNETICS AND SYSTEMS, 2013, 44 (08) : 681 - 703
  • [32] Evaluations of embedded Modules dedicated to multimodal Human-Robot Interaction
    Burger, Brice
    Lerasle, Frederic
    Ferrane, Isabelle
    RO-MAN 2009: THE 18TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, VOLS 1 AND 2, 2009: 341+
  • [33] Multimodal Approach to Affective Human-Robot Interaction Design with Children
    Okita, Sandra Y.
    Ng-Thow-Hing, Victor
    Sarvadevabhatla, Ravi K.
    ACM TRANSACTIONS ON INTERACTIVE INTELLIGENT SYSTEMS, 2011, 1 (01)
  • [34] Speech to Head Gesture Mapping in Multimodal Human-Robot Interaction
    Aly, Amir
    Tapus, Adriana
    SERVICE ORIENTATION IN HOLONIC AND MULTI-AGENT MANUFACTURING CONTROL, 2012, 402 : 183 - 196
  • [35] Research on multimodal human-robot interaction based on speech and gesture
    Deng Yongda
    Li Fang
    Xin Huang
    COMPUTERS & ELECTRICAL ENGINEERING, 2018, 72 : 443 - 454
  • [36] Action Alignment from Gaze Cues in Human-Human and Human-Robot Interaction
    Duarte, Nuno Ferreira
    Rakovic, Mirko
    Marques, Jorge
    Santos-Victor, Jose
    COMPUTER VISION - ECCV 2018 WORKSHOPS, PT III, 2019, 11131 : 197 - 212
  • [37] Towards Real-time Probabilistic Evaluation of Situation Awareness from Human Gaze in Human-Robot Interaction
    Paletta, Lucas
    Dini, Amir
    Murko, Cornelia
    Yahyanejad, Saeed
    Schwarz, Michael
    Lodron, Gerald
    Ladstaetter, Stefan
    Paar, Gerhard
    Velik, Rosemarie
    COMPANION OF THE 2017 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'17), 2017, : 247 - 248
  • [38] Admittance control for physical human-robot interaction
    Keemink, Arvid Q. L.
    van der Kooij, Herman
    Stienen, Arno H. A.
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2018, 37 (11) : 1421 - 1444
  • [39] Semantic Gaze Labeling for Human-Robot Shared Manipulation
    Aronson, Reuben M.
    Admoni, Henny
    ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, 2019
  • [40] Simplified Human-Robot Interaction: Modeling and Evaluation
    Daniel, Balazs
    Thomessen, Trygve
    Korondi, Peter
    MODELING IDENTIFICATION AND CONTROL, 2013, 34 (04) : 199 - 211