Command Selection in Gaze-based See-through Virtual Image-Guided Environments

Cited by: 4
Authors
Afkari, Hoorieh [1 ]
Perez, David Gil de Gomez [1 ]
Bednarik, Roman [1 ]
Affiliations
[1] Univ Eastern Finland, Sch Comp, Joensuu, Finland
Source
AUGMENTED HUMAN 2018: PROCEEDINGS OF THE 9TH AUGMENTED HUMAN INTERNATIONAL CONFERENCE | 2018
Keywords
Gaze interaction; VR; surgical image-guided techniques; gaze-based interaction; surgical microscope; EYE; MICROSCOPE;
DOI
10.1145/3174910.3174940
Chinese Library Classification (CLC)
TP3 [Computing Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
Embedded close-to-the-eye gaze tracking permits new types of interaction in see-through augmented and virtual environments. It is, however, unclear how gaze input can be used to select and confirm commands when the tracking technology sits in close proximity to the user's eyes and cannot rely on a fixed geometry as in screen-based environments. We conducted a study in a simulated image-guided medical environment in which users employed gaze input to control an on-screen display. The current hand-based interaction with such displays is a frequent source of interruption, so the feasibility of alternative input modalities has to be evaluated. We created a three-stage gaze-based confirmation mechanism and evaluated its robustness and the limits of the target size. Two target sizes for command selection were evaluated, occupying 12 and 6 degrees of visual angle at a 30 cm viewing distance. The results show that the time to perform an action using gaze input is shorter than with hand-based interaction on the real-world device, confirming that this input modality is feasible. The target size has little effect on the interaction, and the completion error is low. The findings have implications for the design of future gaze-based input methods for these devices.
Pages: 8
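
For context, the physical extent of the two targets reported in the abstract can be recovered from the standard visual-angle relation, size = 2 * d * tan(theta / 2). The short Python sketch below only illustrates that conversion for the 12- and 6-degree targets at the stated 30 cm distance; it is not code from the study, and the helper name is illustrative.

import math

def target_extent_cm(visual_angle_deg: float, distance_cm: float) -> float:
    # Physical size of a target subtending visual_angle_deg at distance_cm.
    return 2 * distance_cm * math.tan(math.radians(visual_angle_deg) / 2)

for angle_deg in (12, 6):
    print(f"{angle_deg} deg at 30 cm -> {target_extent_cm(angle_deg, 30):.1f} cm")
# 12 deg -> approx. 6.3 cm, 6 deg -> approx. 3.1 cm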