A gaze-based virtual keyboard using a mouth switch for command selection

Cited: 0
Authors
Soundarajan, S. [1 ]
Cecotti, H. [1 ]
Affiliations
[1] Fresno State Univ, Dept Comp Sci, Coll Sci & Math, Fresno, CA 93740 USA
Source
2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) | 2018
Keywords
DOI
Not available
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
Portable eye-trackers provide an efficient way to access a user's point of gaze on a computer screen. Thanks to eye-tracking, gaze-based virtual keyboards can be developed by taking into account constraints related to the gaze detection accuracy. In this paper, we propose a new gaze-based virtual keyboard in which all the letters can be accessed directly through a single command. In addition, we propose a USB mouth switch that is directly connected through a computer mouse, with the mouth switch replacing the left click button. This approach is intended to tackle the Midas touch problem in eye-tracking for people who are severely disabled. The performance is evaluated on 10 participants by comparing three conditions: gaze detection with the mouth switch, gaze detection with dwell time based on the distance to the closest command, and gaze detection within the surface of the command box. Finally, a workload assessment using the NASA-TLX test was conducted for the different conditions. The results revealed that the proposed approach with the mouth switch provides better performance in terms of typing speed (36.6 +/- 8.4 letters/minute) than the other conditions, and a high acceptance as an input device.
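As an illustration of the three selection conditions compared in the abstract, the following Python sketch shows one way the selection logic could look. The key layout, coordinates, and dwell threshold below are hypothetical and are not taken from the paper; the mouth switch is modelled simply as a boolean left-click signal.

    import math
    import time

    # Illustrative key layout: command label -> (center_x, center_y, width, height) in pixels.
    KEYS = {
        "A": (100, 100, 80, 80),
        "B": (200, 100, 80, 80),
        "C": (300, 100, 80, 80),
    }
    DWELL_S = 1.0  # assumed dwell-time threshold in seconds

    def closest_key(gx, gy):
        # Condition 2: the command whose center is nearest to the gaze point.
        return min(KEYS, key=lambda k: math.hypot(gx - KEYS[k][0], gy - KEYS[k][1]))

    def key_under_gaze(gx, gy):
        # Condition 3: the command whose box contains the gaze point, or None.
        for k, (cx, cy, w, h) in KEYS.items():
            if abs(gx - cx) <= w / 2 and abs(gy - cy) <= h / 2:
                return k
        return None

    class DwellSelector:
        # Emits a command once the same candidate has been fixated for DWELL_S seconds.
        def __init__(self, use_closest=True):
            self.use_closest = use_closest
            self.candidate = None
            self.since = 0.0

        def update(self, gx, gy, now=None):
            now = time.monotonic() if now is None else now
            k = closest_key(gx, gy) if self.use_closest else key_under_gaze(gx, gy)
            if k != self.candidate:
                self.candidate, self.since = k, now
                return None
            if k is not None and now - self.since >= DWELL_S:
                self.since = now  # re-arm so the same key is not emitted every frame
                return k
            return None

    def switch_select(gx, gy, switch_pressed):
        # Condition 1: the mouth switch, seen by the operating system as a left
        # click, confirms the key currently closest to the gaze point.
        return closest_key(gx, gy) if switch_pressed else None

In this sketch, switching between the nearest-command and within-box dwell conditions only changes which candidate function feeds the same dwell timer, while the switch condition replaces the timer with an explicit confirmation signal.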
Pages: 3334-3337
Page count: 4
Related Papers
50 records in total
[1] Cecotti, H.; Meena, Y. K.; Bhushan, B.; Dutta, A.; Prasad, G. A multiscript gaze-based assistive virtual keyboard. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2019: 1306-1309.
[2] Afkari, Hoorieh; Perez, David Gil de Gomez; Bednarik, Roman. Command Selection in Gaze-based See-through Virtual Image-Guided Environments. Augmented Human 2018: Proceedings of the 9th Augmented Human International Conference, 2018.
[3] Jimenez, Jorge; Gutierrez, Diego; Latorre, Pedro. Gaze-based Interaction for Virtual Environments. Journal of Universal Computer Science, 2008, 14(19): 3085-3098.
[4] Chen, Xiuli; Acharya, Aditya; Oulasvirta, Antti; Howes, Andrew. An Adaptive Model of Gaze-based Selection. CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
[5] Li, Zhenxing; Akkil, Deepak; Raisamo, Roope. Gaze-based Kinaesthetic Interaction for Virtual Reality. Interacting with Computers, 2020, 32(1): 17-32.
[6] Peters, Christopher; Asteriadis, Stylianos; Karpouzis, Kostas. Investigating shared attention with a virtual agent using a gaze-based interface. Journal on Multimodal User Interfaces, 2010, 3(1-2): 119-130.
[7] Peters, Christopher; Asteriadis, Stylianos; Karpouzis, Kostas. Investigating shared attention with a virtual agent using a gaze-based interface. Journal on Multimodal User Interfaces, 2010, 3: 119-130.
[8] Hild, Jutta; Peinsipp-Byma, Elisabeth; Voit, Michael; Beyerer, Juergen. Suggesting Gaze-based Selection for Surveillance Applications. 2019 16th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), 2019.
[9] Chen, Xiao-Lin; Hou, Wen-Jun. Gaze-Based Interaction Intention Recognition in Virtual Reality. Electronics, 2022, 11(10).
[10] Kopacsi, Laszlo; Barz, Michael; Klimenko, Albert; Sonntag, Daniel. Exploring Gaze-Based Menu Navigation in Virtual Environments. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, SUI 2024, 2024.