See-Thru: Towards Minimally Obstructive Eye-Controlled Wheelchair Interfaces

Cited: 9
Authors
Singer, Corten [1 ]
Hartmann, Bjorn [2 ]
Affiliations
[1] Univ Calif Irvine, Irvine, CA 92697 USA
[2] Univ Calif Berkeley, Berkeley, CA 94720 USA
Source
ASSETS'19: THE 21ST INTERNATIONAL ACM SIGACCESS CONFERENCE ON COMPUTERS AND ACCESSIBILITY | 2019
Keywords
Eye Gaze; Eye Tracking; Gaze Control; User Interfaces; Power Wheelchair; Navigation; Field of View (FOV); GAZE; MOVEMENTS; TRACKING;
DOI
10.1145/3308561.3353802
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Eye-tracking interfaces increase the communication bandwidth between humans and computers when using the hands is not possible. For some users, the eyes are the only available input modality for controlling and interacting with the devices that enable their independence. The goal of this work is to develop and evaluate an eye-controlled wheelchair navigation interface that minimizes obstruction of the user's field of view by removing the conventional use of a computer screen as a feedback mechanism. We present See-Thru, an eye-tracking interface that provides feedback to the user without a screen while simultaneously providing a clear view of the path ahead. Our prototype is evaluated against a screen-based state-of-the-art interface in a study of three navigation tasks completed by seven power wheelchair users. Our results show that a majority of participants not only prefer the See-Thru interface but also perform better at driving tasks when using it. This supports the notion that users favor minimally obstructive interfaces in navigational contexts.
Pages: 459-469
Page count: 11