A multiscript gaze-based assistive virtual keyboard

Times Cited: 0
Authors
Cecotti, H. [1 ]
Meena, Y. K. [2 ]
Bhushan, B. [2 ]
Dutta, A. [2 ]
Prasad, G. [3 ]
Affiliations
[1] Fresno State Univ, Coll Sci & Math, Dept Comp Sci, Fresno, CA 93740 USA
[2] Indian Inst Technol IIT, Ctr Mechatron, Dept Humanities & Social Sci, Kanpur, Uttar Pradesh, India
[3] Ulster Univ, Intelligent Syst Res Ctr, Coleraine, Londonderry, Northern Ireland
Source
2019 41ST ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC) | 2019
DOI
10.1109/embc.2019.8856446
CLC Number
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
The recent availability of inexpensive and accurate eye-trackers enables the creation of gaze-based virtual keyboards that can serve a large population of disabled people in developing countries. Eye-tracking technology allows gaze-based virtual keyboards to be designed around constraints such as gaze detection accuracy and the target display device. In this paper, we propose a new multimodal, multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed according to the script. Traditionally, virtual keyboards are assessed for a single language (e.g. English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate while participants spelled a sentence using their gaze for pointing to a command and a dedicated mouth switch for command selection. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, even though it was the participants' secondary language.
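The abstract evaluates the keyboard via the information transfer rate (ITR). The paper does not state its exact computation here; a common formulation in this literature is the Wolpaw definition, which combines the number of selectable commands, the selection accuracy, and the selection time. A minimal sketch of that standard formula (not necessarily the authors' exact method; the key count and timing values below are illustrative assumptions):

```python
import math

def bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Wolpaw bits per selection:
    log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1))."""
    n, p = n_targets, accuracy
    if n < 2 or p <= 0.0 or p > 1.0:
        raise ValueError("need n_targets >= 2 and 0 < accuracy <= 1")
    if p == 1.0:
        return math.log2(n)  # error terms vanish at perfect accuracy
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

def itr_bits_per_minute(n_targets: int, accuracy: float,
                        seconds_per_selection: float) -> float:
    """Scale bits/selection by the selection rate to get bits/minute."""
    return bits_per_selection(n_targets, accuracy) * 60.0 / seconds_per_selection

# Illustrative values: 26 commands, 90% accuracy, one selection every 2 s.
print(round(itr_bits_per_minute(26, 0.9, 2.0), 2))
```

Under this definition, faster selections and higher accuracy both raise the ITR, which is why a gaze-pointing plus mouth-switch pipeline can be compared across scripts on a single scale.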
Pages: 1306-1309
Number of pages: 4