Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera

Cited by: 11
Authors
Drakopoulos, Panagiotis [1 ]
Koulieris, George-alex [2 ]
Mania, Katerina [1 ]
Affiliations
[1] Tech Univ Crete, Sch Elect & Comp Engn, Univ Campus Kounoupidiana, Khania 73100, Greece
[2] Univ Durham, Math Sci & Comp Sci Bldg, Upper Mountjoy Campus,Stockton Rd, Durham DH1 3LE, England
Keywords
Mobile VR; eye tracking; robust pupil detection; gaze
DOI
10.1145/3456875
CLC Classification
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) currently rely on uncomfortable head tracking to control a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and infrared emitters. We present an innovative mobile VR eye tracking methodology that uses only the eye images captured by the front-facing (selfie) camera through the headset's lens, without any hardware modifications. Our system first enhances the low-contrast, poorly lit eye images with a pipeline of customised low-level image enhancements that suppress obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that runs only once; by reducing the iris search space, it increases iris tracking speed on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates, and we display a thin bezel of light at the top edge of the screen for constant illumination. A confidence metric estimates the probability of successful iris detection. Calibration and a linear gaze mapping between the estimated iris centroid and physical pixels on the screen result in low-latency, real-time iris tracking. A formal study confirmed that our system's accuracy is similar to that of eye trackers in commercial VR headsets in the central part of the headset's field of view. In a VR game, task completion time with gaze-driven interaction was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
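The calibration and linear gaze mapping step described above can be sketched as a least-squares fit from estimated iris centroids to known on-screen calibration targets. This is a minimal illustrative sketch, not the paper's implementation: the function names, the affine form of the map, and the five-point calibration layout are all assumptions.

```python
# Hypothetical sketch of calibration + linear gaze mapping: fit an affine map
# from iris centroids (image pixels) to screen coordinates using a few
# calibration fixations, then apply it per frame at runtime.
import numpy as np

def fit_gaze_map(iris_centroids, screen_points):
    """Least-squares affine map (x, y) -> (u, v).

    iris_centroids: (N, 2) iris centres recorded while the user fixated targets
    screen_points:  (N, 2) known on-screen calibration targets (pixels)
    Returns a (3, 2) matrix A such that [x, y, 1] @ A ~= [u, v].
    """
    X = np.hstack([iris_centroids, np.ones((len(iris_centroids), 1))])
    A, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return A

def map_gaze(A, centroid):
    """Map one estimated iris centroid to an on-screen pixel coordinate."""
    x, y = centroid
    return np.array([x, y, 1.0]) @ A

# Toy example: five calibration fixations whose screen targets happen to be
# an exact 10x scaling of the iris coordinates, so the fit recovers that map.
calib_iris = np.array([[100, 100], [200, 100], [100, 200],
                       [200, 200], [150, 150]], dtype=float)
calib_screen = calib_iris * 10.0
A = fit_gaze_map(calib_iris, calib_screen)
print(map_gaze(A, (150, 150)))  # ≈ [1500. 1500.]
```

In practice the calibration points would span the usable field of view, and per the abstract the mapping is only applied when the confidence metric indicates a successful iris detection.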
Pages: 20